Lol, I've read so many discussions about "Microsoft should remove the registry" on the web, but the defenders of that move never point to a solution that would fix all the problems without raising new ones. This recent (June 2011) thread on Ars Technica has arguments that address many of the points raised here. The OP begins the thread by saying "I think Win8 will not have the registry." I'm not going to repeat everything posted there. Good luck with your reading.
The registry pro/con debate should be its own topic for sure. It could easily overtake this thread, I would imagine. Not that it matters, though, as the registry will remain due to compatibility issues, the same as those "ghost folders" dw426 refers to. One day, though, they may just drop the legacy support and start anew. Sul.
It's less about attackers bypassing those protections and more about them having to look elsewhere for exploits. We've seen how effective Vista/7 have been; they aren't targeted nearly as much. Instead, we have third-party software under fire because it hasn't kept up.

Imagine telling users "You can't install applications without logging out, logging into a new account, installing, and then switching back." You think anyone wouldn't just stay on the admin account? Anyone who wouldn't would probably be running LUA as it is. And Linux and Windows do the same thing: both put you on a fully capable account, and you elevate with prompts. The difference is that Linux defaults to limited and elevates to root via sudo, but root is still attainable as a user, just as in Windows.

The only time there are issues with the registry is when people mess with it using scamware registry cleaners and the like. It's like having one big config file instead of multiple little ones, which has advantages just as it has disadvantages. It's really not a big deal, especially with Vista and 7 creating virtualized registries for programs. I see the registry pointed to all the time by Linux users, and I think it's got to be someone spreading rumors that people just catch on to. Really, I have never seen any other system that provides any real solutions or benefits.

I'm running the dev preview. I think the beta is due in January and the release is due in August. I could be mistaken.
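The registry virtualization mentioned above (Vista/7 silently redirecting a legacy program's writes to protected keys into a per-user store) can be sketched roughly like this. Everything here is an illustrative toy, not the real Windows mechanism or API; the actual redirection happens in the kernel and lands under HKCU\Software\Classes\VirtualStore:

```python
# Toy sketch of Vista/7-style registry virtualization: a non-elevated
# legacy app's writes to a protected key get redirected to a per-user
# "virtual store", and reads check that overlay before the real key.
# Class and key names are invented for illustration only.

PROTECTED_PREFIX = "HKLM\\Software"

class VirtualizedRegistry:
    def __init__(self):
        self.machine = {}        # system-wide keys (admin-writable only)
        self.virtual_store = {}  # per-user copy-on-write overlay

    def write(self, key, value, elevated=False):
        if key.startswith(PROTECTED_PREFIX) and not elevated:
            # Legacy app without admin rights: silently redirect the write.
            self.virtual_store[key] = value
        else:
            self.machine[key] = value

    def read(self, key):
        # The per-user overlay wins over the machine-wide value.
        if key in self.virtual_store:
            return self.virtual_store[key]
        return self.machine.get(key)

reg = VirtualizedRegistry()
reg.write("HKLM\\Software\\OldApp\\Setting", "default", elevated=True)
reg.write("HKLM\\Software\\OldApp\\Setting", "user-tweaked")  # redirected
print(reg.read("HKLM\\Software\\OldApp\\Setting"))   # user-tweaked
print(reg.machine["HKLM\\Software\\OldApp\\Setting"])  # default
```

The point of the design is that the old program keeps working without admin rights, while the machine-wide configuration stays untouched.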
I'm wondering what percentage of infections are exploits and what percentage are social engineering. You can't blame the OS for a user giving their admin password to a malicious file. I don't like the fact that Windows 8 still wants to create the first user account as admin and then tie it to a Windows Live ID as well. Does anyone here know if the first user account created on OS X Lion has admin rights?
Source: http://www.microsoft.com/security/sir/archive/default.aspx

Yes, it does. Reference: http://support.apple.com/kb/HT4796 In the article's text you can read this snippet: "The first account created on the Mac is an administrator account."
I can and do. And every major OS by default will give you a fully capable admin account, or an account capable of escalating to admin, because it would be so ridiculously inconvenient to do it any other way.

"User interaction" is vague. If I click on a link and it takes me to an exploit page, it's both user interaction and an exploit. If I visit a page that's been exploited and I get a UAC prompt, it's both an exploit and user interaction. If a user is deceived into running an exploit on a webpage, or intended to go to that webpage, their flowchart doesn't really work. Either you're deceived into running malware, or it's a vulnerability; and you can never be "tricked into running" a vulnerability.
That is ridiculous IMO. There is no way for an OS to be used without an admin. If the admin is fooled into running an exploit, or haphazardly executes an infected file, there is no way to control this. How can you blame the OS for what the user does? I don't understand that at all.

Now, I can see the argument that the OS should not make it easy for everyone to be an admin, sure. But at some point, an admin will be required to do certain things. Corporate environments pay people to admin responsibly. They own the workstations, and thus can make decisions on what the end user can do. In those cases, the end user is restricted by an admin, and problems are much, much less likely to develop, if the admin is doing their job the right way.

Homeowners as admins? Well, unless you want a canned experience with the OS determining what you may or may not do, how do you keep homeowner admins from making mistakes? Impossible. I would never want to own an OS (let alone pay hundreds of dollars for one) that determines what I can or cannot do (install/run/view/etc.). I would rather take matters into my own hands, which may be dangerous if I don't know what I am doing, but at least it lets me decide what I want to do.

IMO, you can blame an OS for a flaw that lets an exploit happen without the user having any interaction at all, but you can't blame the OS for what a user does. You might as well let some helpdesk person in Redmond decide whether you can install some new software/driver or view a certain webpage, if that is what you are seeking: for the OS to be responsible for all problems.

The bottom line is very simple. If users are allowed to make their own decisions, then they also bear the responsibility for those decisions. Maybe there should be an OS that is micro-managed by the manufacturer, maybe subscription driven, for people who don't want to be responsible. That way they can be told what is safe and what is not, and perhaps stay problem free until a hacker/cracker pops them... then they can truly blame it on the OS and not their own actions.

Sul.
@Sully, you are bang on, 100% correct. I could not agree with you more, and you stated it with utter perfection. @Hungryman, without question you are very intelligent, as many of your posts show, but this one stubborn conviction you maintain, that the developers of the OS are responsible and at fault for giving admin privileges to malware, is, as Sully put it, RIDICULOUS. No other way to put it and no way to sugarcoat it.
It's a whole separate topic, really. I wouldn't want to derail this one too much. Suffice it to say that I think Microsoft should account for users making bad decisions, and in the end it's the devs' job to ensure that the user either always makes the right decision or that there is no wrong choice.
Wait, how would you possibly ensure a user always makes the right choice, outside of rigging their keyboard and mouse to a car battery? No wrong choice how? You're talking about an OS that has to work with possibly hundreds of thousands of individual programs, all with their own behavior, many legit, many bad news. There's no safety measure on Earth that can account for all bad user behavior or programs, save for unplugging the box from the wall. Nothing exists that can't be bypassed, either manually or through trickery/all-out assault. What would you possibly have in mind to account for all this? You expect MS, the company that brought you the Blue Screen of Death and Patch Tuesday, among other gems, to pull this off? Lol, can I get a puff of what you're hitting?

I get you, Hungry, trust me, I do. But your post is asking for the impossible, technically and, more important to them and most companies, financially. I too believe much, much more can and should be done, but expecting miracles with this sort of thing is just foolish.
You just said, though, that you expect them to account for users making bad decisions. That's still just as complicated, really, as they have no idea what a user will do, or what a program will do. Do you mean third-party software devs should not allow users to make a wrong choice? Where does that leave software such as HIPS, firewalls, and anti-malware?

With anti-malware, it's easy enough to force an automatic decision to delete, but we've seen numerous times what happens when a bad definition goes out, or an FP is encountered. Now, of course, you'd still have to make auto-delete optional... otherwise you're going to have a really bad day when your customers come for your head. So, keeping it forced is out of the question, and keeping it optional is no better at solving your issue than what is already done.

Like it or not, things are going to go wrong whether you leave in the decision process or leave it out. I dare say more problems will occur taking decisions away than keeping them in place. There's no real or easy solution. Users don't like restrictions, and programs are still developed with the ability and need to do things they really don't need to be doing (admin privileges, hooking into too much, that sort of thing).

Virtual software is as close as we can get to taking all things into account, and being able to either stop things from going wrong or start over when they do. But even then, it doesn't solve it all, and malware will increasingly target it if it ever reaches mass use.
"Virtual software is as close as we can get to taking all things into account, and being able to either stop things from going wrong, or start over when they do."

Exactly, which is why my kids (the worst user-decision-makers there are) don't get to touch a machine without it.

"malware will increasingly target them if they ever reach mass use"

I'm hoping such software stays relatively obscure!

philby
Where? I said they should, not that they will. Behind. Well, not firewalls. Definitely. Deleting files shouldn't happen without definitive results. Forcing deletes is certainly an issue. That doesn't mean you can't force other things. The problem is figuring out which things to force and what program to force those things on. Virtualization is nice. It's not nearly enough though on its own. And all virtualization software today relies on the user to either force a program into a sandbox or let a program out of a sandbox. That's a decision that needs to be removed. It's about minimizing the effects of user choices and eliminating them where possible.
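Removing the sandbox-in-or-out decision from the user, as suggested above, could very roughly look like a launch policy that decides from objective criteria instead of prompting. This is a made-up illustration of the idea; the trust criteria and names below are invented and don't describe any real product:

```python
# Rough sketch of a "no user decision" sandboxing policy: instead of
# asking the user whether to sandbox a program, the launcher decides
# automatically from objective signals. Criteria are illustrative only.

TRUSTED_PUBLISHERS = {"Microsoft Corporation", "Mozilla Corporation"}

def launch_policy(signed_by=None, seen_before=False):
    """Return 'run' or 'sandbox' without ever prompting the user."""
    if signed_by in TRUSTED_PUBLISHERS:
        return "run"      # signed by a publisher we already trust
    if seen_before:
        return "run"      # previously vetted on this machine
    return "sandbox"      # unknown program: isolate it, don't ask

print(launch_policy(signed_by="Microsoft Corporation"))  # run
print(launch_policy())                                   # sandbox
```

The trade-off, as the posts above note, is the same one anti-malware auto-delete faces: any automatic policy will sometimes sandbox (or trust) the wrong thing, with no user in the loop to correct it.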
Re: Will you upgrade to Windows 8 and why?

Same here. Plus, unless it changes, I just don't like what I have seen so far.
Totally agree. I have 2 computers running XP and a new laptop that came with Win 7.