Re: Securing Your PC and Data
Approaches to maintaining a clean system
Computer security is about control, specifically control of the following:
- Operating fidelity of the computer to provide maximal system uptime. You want your system to work, work as intended, and work now.
- Inbound/outbound communications. Your system should serve as your communication tool, not some other individual's platform. That means having some measure of control over communications: in general, blocking unsolicited inbound traffic, preventing undesired outbound communications, and encrypting wireless communications.
- The privacy and security of personal information/credentials used to manage your affairs
- Overall system performance
There are a multitude of paths to nominally equal states of security. It is important to understand that while the paths may be very different, the realized goals may be virtually identical. The different approaches taken often reflect balancing different traits, relying on different levels of expertise, or favoring distinct design aesthetics.
Invariably, to maintain system fidelity, you need to start with a known good state and keep it. This can be implemented system wide or in discrete areas of the file system. It can be a permanent change or invoked as desired. Examples include:
- Native configuration/features
- Objectively speaking, the default configuration of Linux and Mac OS-X workstations is suitable for out of the box use as is. The combination of package management, running as a non-admin user, and a relatively small user footprint makes these systems an unproductive target for malware authors. However, it should be recognized that even if this population were a large and valuable target, the base configuration of these systems is quite strong as is. Whether or not a blacklisting antimalware (AV/AS/etc.) solution is needed on Linux/OS-X systems has been discussed in many venues, including here (see Antivirus Necessary For Linux ? or Antivirus for Linux). The simple answer is no – however, that is not a statement that these platforms are immune to security issues, just that the AV component is currently not a needed facet (a couple of qualifiers - this assumes the machine is a simple end-user PC, not used as a file server for Windows PCs, and not used as any sort of Windows gateway - in which case scanning for Windows based malware may be a desirable end goal).
- Windows configured as a limited user. Using SuRun with XP or maintaining UAC active with Vista and Win7 yields a good generalized solution. WinXP limited user augmented with SuRun is basically how limited users should have been implemented. Most issues involving badly configured legacy programs can be dealt with, but a user really shouldn’t be forced to do this. Key links: SuRun: Easily running Windows XP as a limited user; Maximizing Windows XP security with LUA and SRP ; SuRun - SUDO in Windows – Tutorial; SuRun site; Maximizing Windows VISTA security with LUA and SRP (even without ultimate); For Users of Software Restriction Policies
- Software Restriction Policies (SRP) are an excellent extension along the lines of using the native capabilities of the OS. The main issue is that SRPs can require a higher level of user intervention and expertise. However, there are guides available to assist users of almost any level. For example, see Ruin a malware author's whole day with a Software Restriction Policy! : ) by mechBgon or PGS - Pretty Good Security by Sully. Either link takes the pain out of SRP. (A short script sketch showing how one might check both the limited-user and SRP settings follows the Strengths/Weaknesses note below.)
- Strengths/Weaknesses: The primary strength of this category of approaches is that they take advantage of native capabilities of the OS. Arguably, the default configuration of Windows should go this route, and while that has partially happened with Vista and Win7, the “everyone’s an admin” boat left the dock a long time ago and that precedent is difficult to undo. Furthermore, sloppy coding practices have helped solidify the notion that casual users can (should?) run with Admin level credentials. The simple fact of the matter is that routinely running as an Admin level user is insane. The primary weakness (in Win XP) is that some level of user involvement, and potentially inconvenience, is required to implement limited user accounts and so on. By the same token, this is the default implementation in the Linux/Mac OS-X world. Eventually, this should also be the case in a Windows environment; unfortunately, it appears that this will not occur until Windows XP has passed into disuse.
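For the curious, here is a minimal sketch (Python, Windows only, purely illustrative) of checking the two items discussed above from a script: whether the current session holds admin rights, and what default level an SRP is enforcing. The registry location and the meaning of the DefaultLevel values reflect my understanding of where SRP stores its policy - treat them as assumptions and verify against Microsoft's documentation before relying on them.

```python
# Sketch: query admin status and the SRP DefaultLevel on a Windows box.
# Windows only; the SRP registry location/values are assumptions to verify.
import ctypes
import winreg

def running_as_admin() -> bool:
    """True if the current process has administrative rights."""
    try:
        return bool(ctypes.windll.shell32.IsUserAnAdmin())
    except Exception:
        return False

# Assumed SRP policy location and DefaultLevel meanings (verify before use):
SRP_KEY = r"SOFTWARE\Policies\Microsoft\Windows\Safer\CodeIdentifiers"
LEVELS = {0x0: "Disallowed (default deny)",
          0x20000: "Basic User",
          0x40000: "Unrestricted (default allow)"}

def srp_default_level() -> str:
    """Report the SRP default level, or note that no policy is present."""
    try:
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, SRP_KEY) as key:
            value, _ = winreg.QueryValueEx(key, "DefaultLevel")
            return LEVELS.get(value, f"Unknown level: {value:#x}")
    except OSError:
        return "No SRP policy found"

if __name__ == "__main__":
    print("Running as admin:", running_as_admin())
    print("SRP default level:", srp_default_level())
```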
- Execution control/Monitoring
- Static execution control: This category fixes the configuration and set of applications that are allowed to execute. It is a pure default deny approach. As such, it is extremely robust. No decisions are taken on the fly. Commercially, the applications AntiExecutable by Faronics or Executable Lockdown from Horizon DataSys are examples of this genre. This is a powerful approach, but can be maddening in situations in which the installed application base changes on a regular basis. For systems in which a very static software configuration is the norm (public access PCs in shops/academic settings, institutional computers), this approach excels - it is highly recommended and extremely straightforward to use. (A conceptual sketch of the default deny decision logic follows the Strengths/Weaknesses note below.)
- Dynamic execution control: This category of application allows you to render on the fly decisions with respect to application execution as well as the scope of the process. It is an exceptionally powerful scheme, but with power comes responsibility. To varying levels, providers of this type of application attempt to assist the user in rendering a decision. The structural problem with this type of application is the level of user knowledge and intervention often required. In some instances it is possible to reasonably quickly navigate this need (e.g. a session dedicated to the purposeful launch of the installed applications and configuration of the package). Examples of this category of application would include offerings such as Defensewall, GeSWall, and a host of other applications, some current, some abandoned. While powerful, it remains a niche type solution. However, if the application base on the machine changes on a frequent basis, this type of solution is much more suitable than the static execution approach described above.
- Kareldjag's Security Overflow site contains a number of evaluations of this genre. See in particular An Overview of Personal Desktop/Host IPS, An Overview of Personal Desktop/Host IPS 2, and Host Intrusion and Prevention Tests. While much of the content was written a few years ago, the information remains useful today.
- Strengths/Weaknesses: It is hard to argue with the concept of default deny. It is extremely robust, time tested, and transparent in implementation. Static execution control configurations are default deny, while the dynamic approaches are deny unless you change your mind on the fly. Both approaches have merit and it’s a trade-off between tightness of security and convenience in use. That’s a trade-off that weaves through the whole of PC security. On a truly static system, there is probably no more robust solution than static execution control.
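To make the default deny idea concrete, here is a small conceptual sketch (Python) of hash based static execution control - build a whitelist of known good executables while the system is in a known good state, then refuse anything not on the list. To be clear, this is not how AntiExecutable or Executable Lockdown work internally (I make no claim there); it simply illustrates the "allow only what was present at lockdown" decision logic, and the directory paths shown are placeholders.

```python
# Conceptual sketch of default-deny (whitelist) execution control.
# Build a whitelist of SHA-256 hashes while the system is in a known good
# state, then refuse anything whose hash is not on the list.
import hashlib
from pathlib import Path

def sha256_of(path: Path) -> str:
    """Compute the SHA-256 hash of a file in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

def build_whitelist(roots: list[Path]) -> set[str]:
    """Hash every executable found under the trusted directories."""
    allowed = set()
    for root in roots:
        for exe in root.rglob("*.exe"):
            allowed.add(sha256_of(exe))
    return allowed

def may_execute(candidate: Path, allowed: set[str]) -> bool:
    """Default deny: only known-good hashes are permitted to run."""
    return sha256_of(candidate) in allowed

if __name__ == "__main__":
    # Placeholder trusted locations; adjust to the system at hand.
    whitelist = build_whitelist([Path(r"C:\Program Files"), Path(r"C:\Windows")])
    print(may_execute(Path(r"C:\Users\me\Downloads\setup.exe"), whitelist))
```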
- Virtualization
- Think of virtualization as “security” through the use of a transient and completely disposable computing environment. This isn’t security in the usual sense of the word, although there are very clear security implications to having “disposable” computing environments. Virtualization limits the scope of any malware incursion to a defined session and reduces system recovery to a restart level event, which for any user is a potentially enormous benefit. The scope of a virtualized environment runs the gamut from application level to partition based to a complete virtual system. As one might suspect, the required computing power increases with the scope of the virtualized environment, but most current (2009) machines with suitable RAM are able to readily run any level of virtualization.
- Application level virtualization or sandboxing: A clever, resource light approach to maintaining system fidelity is to allow applications to run in isolation from the remainder of the system - their own sandbox, so to speak (hence the terminology). Sandboxie is the most recognized and probably the most popular variant of this genre. Some antimalware suites have implemented limited versions of sandboxing (e.g. Kaspersky Lab KIS 2010).
- Light/partition virtualization – The next step in this category involves virtualization of a complete partition. Typically, the default is for the OS partition to be virtualized, but some applications allow virtualization of other partitions as well. A couple of general threads are: Light virtualization: Returnil/PowerShadow/ShadowDefender/ShadowUser Pro and Light Virtualization - the first year..... These applications are fairly mature, but started to receive wider usage with the advent of being able to enter virtualization without necessitating a system restart. While a seemingly minor development, this removed a major inconvenience in day-to-day use. Exit from a virtualized session still requires a system restart.
- Full machine virtualization: Complete machine virtualization tends to be employed by users seeking to create and use disposable environments for software and other testing purposes. In general, CPU and RAM demands are somewhat high, but well within the reach of upper middle to high level consumer PCs. A group of paid and free full system virtualization products (VirtualBox, VMware Workstation, VMware Server, VMware Player, Microsoft Virtual PC 2007) are readily available and have been discussed at length in a number of threads and external resources (VirtualBox, Virtual Machines, Virtualization (one section of Mrkvonic’s informative site), VirtualBox 3.0.4). (A short command-line sketch of launching and then discarding a disposable session follows the strengths/weaknesses note below.)
- Strengths/weaknesses: For most users, the most difficult and time consuming phase of dealing with a malware infection is the recovery phase. Virtualization schemes render this step fast and trivial. Weaknesses reside in how the user manages the virtualized state. Typically, full machine and partition virtualization rely on a user decision to enter a virtualized session. It's possible to do this as a matter of course, but that's actually somewhat unusual. Application level virtualization has somewhat more flexibility here in that Internet facing applications for which downloaded content is transient can be readily configured to always launch in a sandboxed environment. In all cases, users can employ a variety of approaches to continually work from a virtualized system and selectively retain information as desired. However, that starts a cascade of user based decisions, made on the fly, and this basically subverts part of the basic rationale for virtualization (facile session disposal/reset).
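To illustrate the disposable-session idea in practice, here is a short sketch (Python) that drives two of the tools mentioned above from their command-line interfaces: launching a browser inside a Sandboxie sandbox, and running a VirtualBox guest that gets rolled back to a clean snapshot afterwards. The install path, sandbox name, VM name, and snapshot name are placeholders, and the command-line switches are offered from memory - check each product's documentation before relying on them.

```python
# Sketch: driving "disposable session" tools from a script.
# Paths and names are placeholders; verify the Sandboxie Start.exe and
# VirtualBox VBoxManage switches against each product's documentation.
import subprocess

SANDBOXIE_START = r"C:\Program Files\Sandboxie\Start.exe"  # placeholder install path

def browse_sandboxed(url: str) -> None:
    """Open a URL inside a Sandboxie sandbox (assumed /box: syntax)."""
    subprocess.run([SANDBOXIE_START, "/box:DefaultBox", url], check=True)

def run_disposable_vm(vm_name: str, snapshot: str) -> None:
    """Start a VirtualBox VM, then discard the session by restoring a snapshot."""
    subprocess.run(["VBoxManage", "startvm", vm_name, "--type", "gui"], check=True)
    input("Press Enter when finished with the disposable session...")
    subprocess.run(["VBoxManage", "controlvm", vm_name, "poweroff"], check=True)
    # Throw away everything the session changed by restoring the clean snapshot.
    subprocess.run(["VBoxManage", "snapshot", vm_name, "restore", snapshot], check=True)

if __name__ == "__main__":
    browse_sandboxed("http://www.example.com")
    run_disposable_vm("DisposableTestVM", "CleanBaseline")
```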
- Rapid recovery/partition restoration
- These applications are focused solely on maintaining maximum system uptime via the ability to quickly restore partitions to a system using a prior saved state. There’s some similarity between this class of applications, virtualization, and classical backup strategies with respect to the end result. However, in terms of the operating mechanism, user flexibility, and speed with which these programs can function, they are clearly a distinct class.
- The gold standard is FirstDefense-ISR from Leapfrog Software, which is currently marketed by Horizon DataSys as FD-ISR Rescue, although an end of life statement seems certain in the near future (see Is First Defense dead?)
- Horizon DataSys is also the vendor of RollbackRx (this program is also available under the EAZ Fix moniker in some locales – Horizon DataSys and EAZ Solutions are said to be “sister” companies). RollbackRx works in a completely different fashion (via sector level redirection that manages what’s effectively a phantom file system on the drive). The advantage of this scheme is exceptional speed to restoration (seconds vs. tens of minutes for other approaches). The downside is that disk manipulations performed outside the RollbackRx environment will corrupt the sector map and render the drive unusable. This detail needs to be appreciated by users of the product, and it does limit offline manipulation of the system.
- Live CD Linux
- A Linux Live CD is a complete environment that can be booted from CD at any time and makes no modification to the installed system. It is the classic example of a disposable computing session: a completely static environment that is simply discarded at shutdown.
- Numerous offerings are available. For additional information, see the *nix resources: a list... thread in the all things UNIX subforum.
- Classical blacklist AV/AS type solutions
- This approach attempts to maintain a clean system by identifying known (or suspected) malware as it appears on a machine and removing it.
- As a largely reactive approach, it is always playing catch-up. Given the explosive growth in malware, this eventually runs into a scalability problem in which response time simply cannot keep up with the onslaught. That end-game has not appeared as yet, and it’s still somewhat off in the future, but it lurks as an almost statistical certainty.
- Vendors in this segment have expanded their toolbox to include some of the other approaches mentioned above, and this provides appreciable benefit. Suites, which combine a number of blacklisting approaches with an integrated firewall and, more recently, features such as application control, allow consumers to implement multiple approaches under the guise of a single package.
- The death of AVs has been proclaimed for years – see examples from 2007, 2006, and 2002, and there are likely earlier ones. There’s a trend here, and it still seems a premature burial.
- Although malware is a global issue and the spread of malware is certainly aided by the connectivity of the Internet, geographic and language localization can matter. For this reason, don't dismiss smaller local offerings that are developed for and concentrate on supporting local markets. Specific examples would be offerings in areas where English is not the dominant language (e.g. China, Eastern Europe, etc.). Vendors based in these areas may not possess the brand name recognition of the larger global vendors, but they can be quite suitable in the local region since their offering will generally be focused on locally active threats.
- The purpose of an AV is to let you know when it sees a file that’s malicious. As a blacklist, it does not validate as good, it identifies as bad. However, absent an expert file identification system such as an AV or related product, how can a regular user identify malware?
- Code analysis of the disassembled file? Ummm…, right…
- Running the file in question and hoping for the best? Chancy in most situations.
- Rely on download source? Actually not bad (compare with package management under Linux), but not always obvious.
- What are we left with then? There really isn’t a good mechanism for regular users to assess whether or not a specific file is malware, save for a blacklist type approach, even with all its negative attributes. It’s here to stay in some form as long as uncontrolled content can be downloaded from the Internet. (One modest aid on the download-source front - verifying a file’s checksum against the value published by its source - is sketched just below.)
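By way of that modest aid, here is a small sketch (Python) of checksum verification: compute the SHA-256 of the downloaded file and compare it to the value the vendor publishes on its site. The file path and the expected hash shown are placeholders for illustration only.

```python
# Sketch: verify a download against the checksum published by its source.
# The file path and expected hash below are placeholders.
import hashlib
import sys

def sha256_of(path: str) -> str:
    """Compute the SHA-256 hash of a file in chunks."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

if __name__ == "__main__":
    downloaded = r"C:\Users\me\Downloads\installer.exe"  # placeholder path
    published = "0123abcd..."                            # value from the vendor's site
    if sha256_of(downloaded) == published:
        print("Checksum matches the published value.")
    else:
        print("MISMATCH - do not run this file.")
        sys.exit(1)
```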
- Finally, if you need to ask someone else "Do I need an AV?", the default answer is most assuredly yes. When you're comfortable with your own counsel in this area, you may be ready. I'll leave open the possibility of foolhardiness, but if you've seriously assessed the relevant material, you should be equipped to deal with most eventualities if you choose to go that route (by the way - it's not a path I recommend on Windows based machines).
- Which to choose? Here are some Wikipedia-based and other listings of options in the various program categories:
Almost all of the major vendor products will yield operationally equivalent behavior in most users' hands. They all cover the widely circulating examples of malware to which you're most likely to fall victim. If you happen to expect malware challenges on a frequent basis (note – realistically, frequent means more than a few times per year), perhaps a best-in-class product as indicated in head-to-head testing (see challenge tests by AV-Comparatives.org or AV-Test.org) should be your product choice if you employ this approach. Note that the various listings mentioned need to be constantly refreshed to maintain current information. For that reason, consult active forums (e.g. Other Anti-virus Software) for the most current information and the AV Screenshots Thread for a peek at what many of the offerings look like in use.
It should be apparent that maintaining system fidelity can be achieved by many paths, some very divergent in approach, others with multiple overlapping traits. The key thing is that all these schemes work, some work best in specific situations, some can be readily combined, and others are combined at the users' risk.
It's also true that one can dispense with any of these solutions altogether, and experience no ill consequences. However, simply because you can successfully do something does not mean that it's prudent to do so. One example hierarchy of solutions could be as follows:
- Start running under a limited user account (XP) or keep UAC enabled (Vista/Win 7). Under XP, install SuRun capabilities. Note, some software still does not work well within a limited user account. SuRun can generally resolve this compatibility issue in a facile fashion. If LUA/UAC are so great, why aren't they used universally? There are likely many reasons and rationales, but three major ones are....
- With Windows XP, LUA is not an enforced default. If you want to direct someone toward a behavior, make it the default so that they must take an active decision not to follow it.
- Sloppy coding that "requires" administrative level privileges when they're not really needed.
- Hubris. That may appear to be a harsh assessment, but it seems borne out in many LUA/Admin level usage discussions held in the past.
- Use an AV. Implementation details (real-time monitoring vs. demand only scanning vs. web-based online scanner or web multiscanner) are left to the user. There are a range of configurations one can use, spanning from continuous real-time monitoring to simple demand only evaluation of specific files. In all cases the role of the AV is the same - it acts as your personal expert system to evaluate whether a file has malicious traits or content.
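As one illustration of the demand-only end of that spectrum, the following sketch (Python) sweeps a downloads folder with a command-line scanner. It assumes ClamAV's clamscan is installed and on the PATH purely for the sake of example - substitute whatever on-demand scanner your chosen vendor supplies.

```python
# Sketch: a demand-only sweep of the downloads folder with a command-line
# scanner. Assumes ClamAV's clamscan is available; any vendor's on-demand
# command-line scanner could be substituted.
import subprocess
from pathlib import Path

def scan_downloads(folder: Path) -> int:
    """Recursively scan a folder, reporting only infected files."""
    result = subprocess.run(
        ["clamscan", "--recursive", "--infected", str(folder)],
        capture_output=True, text=True)
    print(result.stdout)
    return result.returncode  # clamscan convention: 0 = clean, 1 = virus found

if __name__ == "__main__":
    status = scan_downloads(Path.home() / "Downloads")
    print("Clean" if status == 0 else "Something was flagged - investigate.")
```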
- Consider adding a virtualization solution if you frequent seedier elements of the web. For casual usage, application level/sandboxing or partition level virtualization is recommended. For software testing, go with full machine level virtualization.
- If you like to play with malware (self education, etc.) or desire to exert complete control over a machine, employ some type of dynamic execution control measures and employ a firewall that easily allows very granular control. If you go this route, in general you've decided to replace an AV's expert evaluation with your own in response to alerts regarding execution control.
- Finally, although anyone could argue that LUA + AV + virtualization + HIPS + firewall used in combination could yield a really secure machine (and yes, it would in principle), don't go down the road of implementing an excessive patchwork security construct. There are too many downside possibilities.