Sunday, February 26, 2012

Redirect-Cameroon

I just finished reading OpenDNS's take on how to handle the Cameroon (Cameroun) domain, which was the idea that you take the ".cm" and convert it to ".com".  Now, with hindsight, I think they tried to simplify what they are really doing, and that isn't what they are doing at all.  Here is the OpenDNS blog post on that and other items concerning the various new domains.

OpenDNS new TLD ideas

I wrote my own comments on the idea of doing that conversion, saying I didn't think it was a good one.  The very first thing that comes to mind is this one:

http://www.google.cm/

You can set it to use the Français (French) language.  But I analyzed the cm TLD via two lists.  The first was for the malware coming out of the hosts that MalwareDomainList.com (hereafter referred to as just MDL) has blocked over a 3-1/2 year period.  The other was to look for typo-squatters with the hpHosts list, which I last updated on 2012-01-07 (yyyy-mm-dd), mostly to remove old stale entries from my own block list.  Here are the results:

http://www.SecureMecca.com/public/CameroonDomain.txt

I had a contention with only one host in the domain that MDL removed; I will add it back in, along with several other hosts it redirected to.  It was the opening of a new browser window in violation of pop-up control that is earning them their block again.  Is there malware at the end of the tunnel?  I didn't find any, but the recursive retrying over and over via the first route it sent me down makes me want to investigate it again.  The second time I went to the host, it sent me down another redirection channel.  That, and where they headed me off to, earned all of them new blocks.

The most surprising thing was just how few typo-squatters hpHosts had.  There may be more, but unless somebody can give me a list of the ones I don't have, just a redirection of only those hosts that are typo-squatting should be done (a block is really just a redirect to 127.0.0.1 or 0.0.0.0), not the entire domain.  Like I posted in the OpenDNS blog, the CM TLD is not a general DNSWCD (DNS WildCard Domain), even though it does have some elements of that.  It is also not a complete redirect service like the TK TLD.  Here is proof; I used DNS servers other than those from OpenDNS to make sure the results are not part of the redirection:

Host sdkjfjasdkfk.cm not found: 3(NXDOMAIN)

sdkjfjasdkfk.biz.cm is an alias for ghs.l.google.com.
ghs.l.google.com has address 74.125.127.121

lsdkfjdaslfl.bz.cm has address 64.207.157.231

If the latter two are what OpenDNS is referring to, then I apologize.  By that I mean that they give the real IP address for known hosts and something else for everything else, including specious names like the ones I used here.  But if you ask me, it may be best to not return anything at all for something that is not known.  I get really tired of browsers and DNS servers that feel they have to pound through to something, anything!  There are times you should just give up and fail!  Then display the failure message.  I would much rather get a failure for mistyping the domain as "cm" than have the "cm" replaced with "com".  Why?  While I was doing this, over and over I caught myself typing "com" rather than "cm" when I intentionally wanted to use the latter.  Since "com" represents most of the Internet, I cannot remember even one instance where I have accidentally dropped the "o" from ".com".

All of this brings up gripes that I have.  Comcast says something is bad via their anti-bot service.  The browser says something is bad.  DNS services say something is bad.  Even the Firefox browser goes up to a reporting site in many situations.  Where are these services getting their lists of bad hosts?  They are getting them from people like me.  We find them.  It is okay if you block them for others, but do not get in our way, intending to protect us, when we are attempting to see if the threat is still there.  This is especially true since I don't work from Windows.  I work from Linux.  I may do a test from time to time on Windows, but only after I am pretty sure the threat is gone on Linux first.  How often do I do those tests on Windows?  Less than 2-3 times per year.  But if you don't get out of the way, there is no way for us to probe whether the threat is still there or is now gone.  If we retained everything, there would be no way for us to delist hosts that just had vulnerable web servers and are now clean.  Those people want their web servers delisted ASAP.  If we don't delist them, we would soon have so many hosts that are no longer a threat that perhaps less than 5% of them would still pose one.  It would be like the little boy crying wolf when there is none.  We are getting into a catch-22 situation here.

Oh yes, I already block the new ".xxx" domain in the PAC filter with this rule:

BadHostParts[i++] = "xxx";
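For the curious, a rule like this is enforced inside the PAC file's standard FindProxyForURL() entry point.  Here is a stripped-down sketch; the hasBadPart() helper name and the 127.0.0.1:3421 black-hole proxy address are illustrative, not necessarily what the full filter uses:

```javascript
// Stripped-down sketch of BadHostParts enforcement in a PAC file.
// The helper name and the black-hole proxy address are illustrative.
var i = 0;
var BadHostParts = new Array();
BadHostParts[i++] = "xxx";

// true if any bad pattern appears anywhere in the host name
function hasBadPart(host) {
  for (var j = 0; j < BadHostParts.length; j++) {
    if (host.indexOf(BadHostParts[j]) !== -1) {
      return true;
    }
  }
  return false;
}

function FindProxyForURL(url, host) {
  if (hasBadPart(host)) {
    // route the request to a dead proxy so it silently fails
    return "PROXY 127.0.0.1:3421";
  }
  return "DIRECT";
}
```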

It is a malware rule, not a porn rule.  I have white-listed most places where there may be a problem.  But unlike the new ".xxx" porn domain, the hosts that use the "xxx" pattern some place else in the host name frequently dish up malware.  MDL had nine hosts with that pattern in them two weeks ago, and that number seems to be fairly consistent over time, with MDL having a half dozen to three dozen hosts with that pattern at all times.  Does the rule cause me any False Positives (FPs)?  Very rarely any more, and almost none that are completely catastrophic.  By that I mean that frequently I wouldn't even know the blocks had happened without studying my block logs every month.  Most people have learned to avoid that pattern so as not to get stuck in a porn list some place.  The pattern "69", OTOH, does cause me some FPs from time to time:

PAC Filter Changes

That is the folder that shows the changes that have been made to the filter over time.  I am now making a zip of the entire folder available, which will make it easier to get almost all of the hhh_yyyy_mm_dd_changes.txt files at one time.

http://www.securemecca.com/public/pubpaclog.7z

This zip will not have the latest change file until a snap occurs, and may not even have it then unless I remember it.  The other file that will probably forever be out of date is the ComparePac.7z file, which compares the latest PAC filter update with the previous one.  This is not just the snaps, which are represented in the hhh_yyyy_mm_dd_changes.txt files.  The ComparePac.7z folder has files that reflect the changes from the last update of the PAC filter to the current one.  So for these, whatever is in the zip will probably be hopelessly out of date compared to what is in the folder.  At least now you have a way to find out how something like the "69" rule got exclusions over time.  Like ClamAV and Linux, if you have other problems, alter it yourself unless you feel the error would affect lots of people other than just you.  In that case, drop me a note and I will try to work out a correction.  I already have one of them:

GoodDomains[i++] = ".cf[(0-9)].rackcdn.com";

The other alternative is to just comment the "69" rule out.  A nobbled PAC filter protecting you is better than nothing at all.  Blocking ads is way down on my priority list.  Although blocking malware is listed third, that third placing is just there to resolve conflicts.  Blocking trackers, web-bugs, and malware are effectively all at the same level.
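To make the exclusion mechanics concrete, here is a hedged sketch of how a GoodDomains entry can exempt a host before a BadHostParts rule like "69" ever fires.  Treating the entry as a regular-expression fragment is my assumption; the ordering - whitelist first, block second - is the point:

```javascript
// Sketch: GoodDomains is consulted before BadHostParts, so an
// exclusion always wins over a block.  Treating entries as regular
// expressions, and the black-hole proxy address, are assumptions.
var BadHostParts = ["69"];
var GoodDomains = ["\\.cf[0-9]\\.rackcdn\\.com$"];

function isWhitelisted(host) {
  for (var j = 0; j < GoodDomains.length; j++) {
    if (new RegExp(GoodDomains[j]).test(host)) {
      return true;
    }
  }
  return false;
}

function FindProxyForURL(url, host) {
  if (isWhitelisted(host)) {
    return "DIRECT";  // exclusion wins over any block rule below
  }
  for (var j = 0; j < BadHostParts.length; j++) {
    if (host.indexOf(BadHostParts[j]) !== -1) {
      return "PROXY 127.0.0.1:3421";  // illustrative black-hole proxy
    }
  }
  return "DIRECT";
}
```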

Saturday, February 18, 2012

Chrome-Windows-Problems-II

Well, the battle has been lost and maybe even the war.  I just did the following in preparation to see if I could get rid of the "proxy pac file loaded" messages caused by Google Chrome:

  1. I removed all of the debug statements from both the proxy_en.txt and dbgproxy_en.txt files.  So that message isn't even in these files any more, since it was a conditional debug output and now that is gone for good.
  2. Before, I had just the C:/etc folder.  I am giving these in the same notation that Microsoft had from the start and that is congruent with the notation you give in the Internet Settings.  That folder still exists, but I have also created a new folder inside it - C:/etc/OneFile/ - for the proxy_en.txt and proxy_fr.txt files.  I changed the Internet Settings to have the new file in the new location - "file://C:/etc/OneFile/proxy_fr.txt".  There should be no contamination from debug statements since they have been stripped from the code.  Even if Chrome is loading every file in the folder, I should not get the message.
  3. I removed the Chrome browser, including all user settings, using Chrome's own removal program.  I did this for every user on the system, since Chrome has to be installed for every user on the system.  It installs itself in the %USERPROFILE%\Local Settings\Application Data\Google\ folder.
  4. Just to be on the safe side, this time around I double clicked on both the AllIEUsersUndo.reg and EachIEUsersUndo.reg files.  The last time I went through this I didn't do that.  I also didn't put them back at the end like I did this time.
  5. I used regedit to search for all strings with "google" (case insensitive) in them and deleted all of the relevant entries.  When you see the CLSID keys, don't just delete the Data if it has "Local Settings" in it.  Be sure you select the key itself and delete it.  If you have SpywareBlaster, you may want to disable everything for a while to get rid of those long lists.  I finally concluded their protection was so old I just removed it.  Also, be sure to go through twice.  I did a search for proxy_fr and it came up with my new setting of "file://C:/etc/OneFile/proxy_fr.txt", not the old one of "file://C:/etc/proxy_fr.txt".  It doesn't matter, because neither one has any debug statements in it now.
  6. I then ran CCleaner with some fairly aggressive settings.  If you have your own list of blocked cookie exceptions, you may want to preserve those in CCleaner by allowing Firefox to keep its site exceptions.  If you use CookieSafe's list as is, then you can reload what I have provided (with perhaps some additions by them).
  7. I then deleted the folder mentioned in step 3.  Then I double clicked on both the EachIEUser.reg and AllIEUsers.reg files to make sure the PAC filter was active for IE, Opera, and Safari.
  8. I rebooted the machine.  The "proxy pac file loaded" messages had disappeared.  In addition to IE, I have both Opera and Safari installed, and if memory serves me correctly, they didn't pop up that message - but since I am rarely on Windows, that isn't an exhaustive test.  I did a test to make sure the PAC filter was active and enforcing in IE, Opera, and Safari.  It was working.  Just remember that Chrome constantly updates itself, and whatever is sticking that string in is causing it to come up when it checks for an update.  It could be because it read every file in the C:/etc folder.  All I know is that I have no problems with Chrome on Linux.
I went several months without Chrome on Windows, being thankful that the message was gone.  Then I reinstalled Chrome.  The "proxy pac file loaded" message came roaring back with a vengeance.  At this point I am dumbfounded.  It could be that originally Chrome was reading all of the files in the C:/etc folder.  I did count several times and it did seem to match my four files that have debug active.  Here is what the C:/etc/ folder looks like (I deleted all of the old files, and I no longer make a copy of proxy_en.txt to proxy.txt and a copy of dbgproxy_en.txt to dbgproxy.txt, which should reduce the count by one):

Etc Folder

Each indent indicates files within a folder.  As you can see, now the only files Chrome should possibly be reading are the proxy_fr.txt and proxy_en.txt files.  It is fine with me if it reads both, since all of the debug statements are now gone from both of those files.  Now, it could be that it is Microsoft's fault, but if it is, why are they remembering the old values?  Why don't I have a problem with IE, Opera, and Safari?  It points very strongly to Chrome as the major source of the problem.  I think they used a method that opens every file in the folder and reads them in one at a time, not just the one specified in Internet Settings.

It is quite possible that Google saved the old information and restores it when I reinstall Chrome.  I will say this again, and again, and again: Clean State will win over Old History any day.  I can remember a sysadmin doing a restore of my password from 18 months earlier.  Duh!  I had changed the password five times since then, and you expect me to remember the sixth password before this one?  Good luck with the other people.  If Google is saving this information and is doing it to be helpful, I can only say that it is definitely not helpful.  But then I think: if they are saving this, then what else are they saving about my internal machine state?  If they are saving other things, then I don't want their browser, especially on Windows.  At least on Linux I can always do a "ps -eadf" and see if it is there.  I constantly monitor my shell startup files anyway, since I produce filters and am in the danger zone constantly.  You can never be too careful.

I have filed a report on the problem with Google:

Chrome Bug Report 109996

Will the mighty Google or Microsoft (depending on whoever is responsible) admit they have a coding bug?  I don't think so.  I do know that before October 2011 I occasionally had "object unexpected" messages that may have been this problem, but I rarely had more than a few.  Ever since then, I get lots of the "proxy pac file loaded" messages if Chrome is installed on Windows.  But now, with proxy_en.txt and proxy_fr.txt being the only files in the C:/etc/OneFile/ folder, I shouldn't get any - and I was still getting them if Chrome was installed.  Now that Chrome has been thoroughly expunged, the "proxy pac file loaded" pop-up messages have disappeared again.  The search for the setting in regedit always gives me this string:

"file://C:/etc/OneFile/proxy_fr.txt"

It can't cause the problem because there are no debug statements in the file at all.  I don't mean just commented out.  The debug statements have been completely ripped out of both this file and the proxy_en.txt file.  And now they are the only files in that folder.


Summary

If you want to use the PAC filter on Windows, you can always use it with Firefox.  It has its own settings and even informs you of what it blocks in the Error Console (Control-Shift-J).  You can also probably use the non-debug versions in the C:/etc/OneFile/ folder with IE, Opera, and Safari with no problems.  I am not having any.  But if you want the Chrome browser, don't use the PAC filter!  All you have to do to disable the PAC filter is double click the AllIEUsersUndo.reg file for any administrator user and the EachIEUserUndo.reg file for all of the other users.
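For reference, the heart of such an undo file is just clearing the proxy auto-configuration value in the per-user Internet Settings key.  A hedged sketch of what one of the *Undo.reg files might contain (the actual files on SecureMecca may do more than this):

```
Windows Registry Editor Version 5.00

; Removing the AutoConfigURL value disables the PAC filter for this user.
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings]
"AutoConfigURL"=-
```

The trailing "=-" is .reg syntax for deleting the value rather than setting it to an empty string.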


Google's Chrome browser and the PAC filter are just plain incompatible.  Unless Google or Microsoft informs me the problem is fixed, I will install Google Chrome one last time around 2012-09-01.  If the behavior is exactly the same, I will go through all of the steps above to get rid of it and not look back.

Sorry

Henry Hertz Hobbit