Discussion in 'other anti-malware software' started by Anti_Spyware, Jan 8, 2006.
I noticed. Does it really matter? FP was the question.
Well... it sure doesn't seem very "stable".
Had it for about a week, and after I undid some system changes,
*PUFF* ~ it was gone!
I'm a newbie at all of this, so be patient with me.
I assume that SiteAdvisor evaluates a website with automated tests, and the internet is indeed ENORMOUS.
I have no idea how long it takes to run all these tests for ONE website.
Isn't it possible to run these automated tests when you right-click on a website (or anything else), and then get a report and rating in a popup window once SA has finished with that one site?
That way you can run SA on any website, whenever the user wants it to run.
If I know a website is safe, I wouldn't run SA, and I know a lot of safe websites.
If I don't know whether a website is safe, I would run SA and read its report to find out what SA thinks about that website.
So I'm talking about software, installed on my computer, that runs several tests on a specific website indicated by me and gives a report and rating when the tests are done.
Even when the tests take some time, it's more important for me to know what SA thinks about an unknown website,
and I'm prepared to wait for this if the time is reasonable.
That way, you could run SA on any existing website on the internet.
I hope you understand what I'm talking about, because I'm not that good at English.
plug-in disappearance; data completeness
Nat, if you "undid some system changes" (e.g. using a system restore tool?), you probably removed the registry entries that cause SiteAdvisor's plug-in to tie into your web browser. I don't think you can fairly blame this on SiteAdvisor. This is normal and *correct* behavior. Do a rollback, and your configuration actually does and should roll back to your prior settings.
Erik, you're right that some of the tests at issue can be done very quickly. But others really can't. A great example is SiteAdvisor's email testing. To test whether or not a site sends large amounts of email, the SiteAdvisor robots register with the site, providing a single-use email address (not provided to any other site), then wait weeks or even months to see how much mail arrives. You can't accelerate that process to mere seconds; even the worst spammers won't send a new signup that many messages in so little time.
By focusing on the most popular sites, SiteAdvisor already has data about most of the sites most users visit most often. I know that's little comfort if they don't have some sites specific users care about. But this should only improve over time. I'm hopeful that data completeness won't be a big problem. Let's give SiteAdvisor another month or two -- at least through their official public launch -- to see how this turns out.
Re: plug-in disappearance; data completeness
Excuse the little misunderstanding... but I was posting my message mostly
in a joking way.
That's why the quotation marks around the word 'stable'.
I restored settings from a couple of system-tightening programs, because I wasn't able to download anything from Microsoft. (UPDATES, people, UPDATES!)
Well... after that, SA just disappeared from my browser, lol.
And you are right! This is 'normal'!!
Now, one question. I see SA in my Add/Remove Programs list.
Do I have to uninstall it from there and THEN reinstall?? I think it would be easier - for me, that is.
I'm not too sure how "rollback" works.
I'm an amateur with technical thingies.
Also, I really like this program, and I think in time it will be a good helper when surfing the net.
Though I don't, and would not, trust it blindly. It's too young still. But I definitely see great potential. I hope you understand what I'm trying to say.
My English is worse than ErikAlberts'.
I sure will give it another month or two. Improvements need time.
P.S. I surfed a couple of sites that, to my knowledge, should have gotten a red light, or at the very least a yellow one.
These sites were marked as green.
Keep up the good job though... this is a program I've been waiting ages for!!!!
How experts can help resolve false negatives
I don't use rollback much either. In any case, I think you should be able to reinstall directly from SiteAdvisor's download page. I've done plenty of in-place reinstalls, overwriting prior versions. Always works just fine. And this is similar to what you'd be doing (overwriting a partially-removed prior version).
If you find false negatives, it is always very helpful to submit them as comments. Just click the browser plug-in's button, view the site dossier page (e.g. a URL of the form http://www.siteadvisor.com/sites/benedelman.org ), then scroll down to User Comments and type your comment. One to two sentences is fine. Then someone from SiteAdvisor will investigate, confirm, and if warranted update the site's rating appropriately. I realize this is a bit of a pain, but if each initial expert user sees and submits just a handful of false negatives per week, it actually may not take long to get these resolved. Also, identifying errors is very helpful in improving the automated analysis -- tells us where the crawlers are getting things wrong, which helps us redesign them so the next crawl is that much more accurate.
If you do take the time to submit comments, I recommend creating a SiteAdvisor account (1-2 clicks extra) so SiteAdvisor staff can get back to you with questions, comments, thanks, praise, T-shirts, what have you. Seriously, they're grateful to those who take the time to write, so experts might as well put their names on their submissions so they can get credit where credit is due.
Or for anyone who doesn't care to submit comments through the SiteAdvisor feedback system, I'm happy to receive them by email.
Thanks Ben!
Will use your recommendations...!
Re: plug-in disappearance; data completeness
Thanks Bedelman, and I fully understand that some tests, like the email tests, can't be done with my proposal, because they have to be spread over a longer period. You can't blame me for trying.
Of course, you could separate the short tests from the long tests into two different programs.
The short tests would give up-to-date information, because they are done immediately,
while the long tests would be kept by the SA team, which also means the SA team wouldn't have to keep all the information about each website, only the data from the long tests.
It depends on what kind of info the short tests give, and whether that info is important enough for the user to have up to date.
I don't know why some game hack sites are rated green.
Metallicakid, if you think you've found an error, here are a few ways you can proceed:
1) Visit the site's dossier page, e.g. http://www.siteadvisor.com/sites/crackz.ws . Scroll down to the User Comments section, and enter your comment. If possible, create an account (very quick & easy) so SiteAdvisor staff can get back to you with questions, or just to say thanks.
2) Send SiteAdvisor feedback.
3) Email me.
SiteAdvisor really does value users' comments and feedback. Submit a comment via method #1 above, and your submission will immediately be visible for the world to see. Furthermore, a SiteAdvisor staff person will look over your submission promptly, verify your claims, and adjust the site's rating if appropriate. So these comments don't just go into a "black hole"; to the contrary, you should see that your well-founded comments produce significant changes.
Re: SiteAdvisor data, completeness, and timeliness
Thank you for taking the time to respond. I would imagine that you are quite busy, so taking time to post here is most appreciated.
I see SiteAdvisor as a first-step tool for users becoming more proactive about protecting their systems. Not a blocking tool, like anti-virus or anti-spyware tools, but a “look before you leap” tool. “Forewarned is forearmed,” as the cliché goes. This is one of the first tools of its type, and I am very excited about its prospects.
I have never liked our current approach. I have always believed that an educated user will always do better at protecting his system than any software and hardware alone.
I’ll give you an example: I work for a very large corporation as a network systems engineer. Our network is protected very well and we do a better job than most. One day I received an e-mail from a known source, but it had a strange attachment, so I did not open it. I immediately sent out an e-mail to all of the people I support and told them not to open any e-mails with attachments, because a virus may have entered through the Exchange server. I then contacted our security office and notified them about the e-mail. They already knew about it: it was the “I Love You” virus, and it was spreading fast. I hung up the phone and it immediately rang. It was a user who had received my e-mail too late; he had already opened the infected file. I told him to unplug the computer from the network and shut it off until I had the tools to clean it.
Education will go a long way in helping to slow down the spread of viruses. When I think about SiteAdvisor on an enterprise level, I imagine all those users avoiding (hopefully) all those sites with red Xs. If they do avoid those red X sites, a lot of company resources will be saved for more productive tasks.
In the meantime, I will continue to look at SiteAdvisor and see what it can do. I will continue to send in my comments and recommendations as well. So far, the responses from the SiteAdvisor staff have been very receptive. Although on my last query, they did not answer my question about robots.txt and the ROBOTS meta tag. My understanding is that these are used by site owners to respectfully ask bots not to scan their sites. I will send them another e-mail.
Which reminds me: I recently submitted my own site to SiteAdvisor. It has a robots.txt file in it. My site does not show up in Google or Yahoo!, but it does on SiteAdvisor, with links followed to another site that I would like to keep private. Not that I am upset, mind you; I used robots.txt to keep my site as private as I could, and this was an opportunity to test SiteAdvisor. I only want to share the site with a few friends.
This seems to me to be another possible sticky point. In my opinion, it is common-law practice to respect the requests within a robots.txt file. Companies like Google and Yahoo! have set a precedent by respecting the site owner’s wishes. By ignoring this file, bot owners open themselves up to litigation.
Personally, I believe that if the site is green and the robots.txt says to disallow, then Siteadvisor should not publish its findings on the site. If it is red however, then post it in neon, flashing lights and horns blaring.
Re: SiteAdvisor data, completeness, and timeliness
After my last post, I had second thoughts about this last paragraph. I changed my position because I am a site owner who would like to have his wishes respected by bot owners. I use the robots.txt file on my personal site and I would like bots to use only the permissions that I have assigned. My interpretation of the protocol is that a bot is supposed to read the robots.txt file first to find out what it is allowed to do. If it is allowed to proceed, then it should proceed only as directed. If it is allowed no access whatsoever, then it should just proceed to the next site. I also believe that every responsible site owner who uses the robots.txt file has the same interpretation.
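That interpretation can be sketched with Python's standard urllib.robotparser module. The robots.txt content, bot name, and URLs below are hypothetical, purely to illustrate the check a compliant bot would make before fetching anything:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt that bars every crawler from the whole site.
robots_txt = """\
User-agent: *
Disallow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A compliant bot consults the parsed rules before fetching any page.
allowed = rp.can_fetch("SomeBot", "http://example.com/index.html")
print(allowed)  # False -- the bot should just move on to the next site
```

In a real crawler, the parser would be pointed at the live file with set_url() and read() instead of parsing an inline string.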
In my opinion, it is common-law practice to respect the requests within a robots.txt file. Companies like Google and Yahoo! have set a precedent by respecting the site owner’s wishes. By ignoring this file, bot owners open themselves up to litigation.
I submitted my own personal site to SiteAdvisor and it has already been analyzed. The information has been published on the SiteAdvisor site. My site came up all green according to SiteAdvisor. The published page shows the addresses of links on my site which it is in the process of analyzing. So in my case, my robots.txt was ignored, as were my wishes. I simply do not want bots scanning my site, and I would like that respected.
I sent an e-mail to Siteadvisor and they recognize this issue. This paragraph from their response summarizes their policy on this issue:
I believe that Siteadvisor needs to revise this policy. At the very minimum, they should not publish pages about green sites that use the robots.txt file to stop bots from scanning their site. They should also not publish information about those parts that are restricted from bots. This to me would be a compromise solution that legitimate site owners could accept.
Illegitimate site owners will cry foul, however, and that is when SiteAdvisor may have to change its policy to follow the intent of the robots.txt protocol to the letter. SiteAdvisor personnel might have to go to bot-blocked sites and test them manually. This would cripple them if they had to resort to it. Only time will tell.
For those of you wondering what a robots.txt file looks like, here is one:
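The sample file did not survive in this post, so here is a generic illustration instead; the paths and the bot name are made up:

```
# Allow all bots everywhere except the /private/ directory...
User-agent: *
Disallow: /private/

# ...and bar one specific (hypothetical) crawler from the whole site.
User-agent: BadBot
Disallow: /
```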
These robots.txt questions are fascinating. Frankly, it's not always obvious how to proceed. On one view, the spirit of robots.txt is to ban all non-human processes. On the other hand, that creates an easy opportunity for bad guys to escape SiteAdvisor evaluation (and inevitable criticism) merely by adding a simple robots.txt file. I think this is a more serious worry than your message indicates.
On a related note, I'm not sure robots.txt has as strong and far-reaching an effect as you suggest. I certainly don't agree that robots.txt has common law significance. But even taking the robots.txt standard for what it is, recall that it's really a standard for search engine type high-volume crawlers (which potentially read thousands of pages on a site), not for Windows-based automated users (that better simulate ordinary users' accesses to a site, and crawl only a handful of pages).
In any event, I think SiteAdvisor's stated response gives a good statement of what SiteAdvisor currently does and why.
I don't have much to add to this. I think it's a sensible policy; I think testing less would ultimately do users and the web a disservice. Testing less might provide some very small benefit to some web sites that prefer to avoid being evaluated, but I'm not sure that's an important benefit, frankly. Consumer Reports doesn't ask manufacturers whether they want their products evaluated, for good reason (because only the bad guys would opt out), and the same idea applies to SiteAdvisor too.
MSAS and SiteAdvisor
MS Antispyware gave me an alert that SiteAdvisor was
trying to add 101egreetings.com to my TRUSTED SITES LIST. I wondered why
this message was coming up. Here is the reply from Support.
---" thanks for writing and for your interest in SiteAdvisor. In our
Internet Explorer version, our client software adds a list of sites that
we've detected as having ActiveX adware to your Restricted Zone to give
you extra protection. We disclose this on the Internet Explorer
pre-installation page, here:
Note that SiteAdvisor is trying to INCREASE security on these sites
by adding them to the Restricted Zone, which will prevent ActiveX
controls on these sites from installing/running.
We are considering removing this feature in the next version because
we've noticed that when some anti-virus software detects our additions
to the Restricted Zone, users are prompted with an alert that makes it
sound like we're decreasing security (when we are actually increasing it).
----" Thank you for following-up with this. This is why we are removing the
Restricted zone addition in our next public release. MS Antispyware
reports any addition to security zones as an addition to the Trusted zone
-- which is really a bug in the Microsoft product -- if you were to
allow SiteAdvisor to make the change, you would see that the sites are
really added to the Restricted zone.
Still, we know many users will see this message from Microsoft and
rightly be very concerned by it. In our next public release it will no
longer be an issue and you will not see this message again."
Cyberhawk gave me an alert too, but it was only a log entry, so I couldn't tell from CH whether the site was added to the Trusted sites or the Restricted ones. I checked from IE: SiteAdvisor is right, they add it to the Restricted sites, but MSAS's reporting is wrong. Interestingly, NIS and WinPatrol did not give any warning.
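For reference, IE assigns a site to a security zone via a per-domain registry entry under ZoneMap\Domains, which is exactly the kind of change the tools above are watching for. A minimal sketch of what such an addition to the Restricted zone (zone 4) looks like as a .reg file; the domain name here is made up:

```
Windows Registry Editor Version 5.00

; Zone values: 0 = My Computer, 1 = Local Intranet, 2 = Trusted,
;              3 = Internet, 4 = Restricted
[HKEY_CURRENT_USER\Software\Microsoft\Windows\CurrentVersion\Internet Settings\ZoneMap\Domains\example-adware-site.com]
"*"=dword:00000004
```

The "*" value name applies the zone to all URL schemes for that domain; a value of 2 instead of 4 would place the site in the Trusted zone, which is presumably what MSAS misreported.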
well there goes the free version
Where do you get that? The main page of SiteAdvisor links to a message from the CEO, which states: "But a few things won’t be changing: * The free features in the current SiteAdvisor software. They will remain free to our current users and will continue to be available for free on our Web site for new users."
I think the free version will go away eventually. I don't think they will work for free forever.
But it is strange that it was bought by McAfee while still in its infancy!!
Hello guys! This is not 1995, when people did everything for free on the internet. Do you really think some guys thought, "Hey, let's just help out the internet community for free"? Maybe there are some people like the maker of S&D who need to do a reality check; the thing, however, is that there is lots of money to be made out there. SiteAdvisor is just another attempt to see what you guys are doing. Heck, even when you uninstall it, it needs to ping back home.
This practice is really bad. I have experienced it with SiteAdvisor, the SpySweeper trial, Google Desktop and many others. It's really annoying. Shame on them. They must give the user an option for this.
I always knew that they were going to sell to someone. I would have preferred that Google was the buyer.
Smart move by McAfee.
They buy ready-made "site rating" software with a big db in place.
At a web page coming to you soon:
"McAfee certified safe"; LOL.
How much revenue do you think it will generate when McAfee starts selling its approval rating?
How high will the McAfee-approved sites move up the Google listings?
How low could you go if McAfee gave you a red rating!!??
Reverse blackmail by proxy?
All speculation, of course.
Don't forget the massive numbers involved. Could be a very good deal for McAfee.
So how do you explain Firefox?
Hatred of Microsoft?