Securing Your PC and Data

Discussion in 'malware problems & news' started by BlueZannetti, Aug 30, 2009.

  1. BlueZannetti

    BlueZannetti Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    6,590
    The material below discusses approaches and issues related to securing your personal computer and the data resident on it.

     Since security is all about balancing risks against benefits, there's an implied calculation behind the development. It is critical to recognize that there is an endless number of implied calculations anyone could make, all leading to slightly to significantly different final states. That divergence in approach is fine as long as the final goals (a secure, malware-free computer with measures implemented to ensure long-term data integrity) are kept in mind and actually achieved.

     Due to the length of the thread, an MS Word document of the complete content is attached to this initial post. Use and share this document freely. It is by no means comprehensive or the last word on the topic. For some it will be both a start and an end, while for others it will simply be a start on another path to securing their system. In any event, my hope is that it will provoke a bit of analytic and introspective thought about developing your own approaches to securing your personal computer systems.
     


  2. BlueZannetti

    What this thread is…

     • The purpose of this thread is to provide a very basic guide on a number of distinct topics. Basic is the key word here. The average user is the target. The underlying approach is more informed recipe development than in-depth coverage. If you find yourself asking “What do I need to do if…?” or “Where do I find information on…?”, hopefully you will leave with a clearer sense of the options available, possible answers, how to go about developing your personal solution, or available resources to consult.

    • A reasonable listing of the available technical options (products, past discussions, etc.) in a number of categories. The value of these links is time dependent. Products and product categories come and go. The base OS changes. Targets change. Performance waxes and wanes. Keep this in mind when reading this or any other guidance.

    • A genuine attempt to provide some context and nuance. Nuance seems a lost trait in many discussions of computer security and privacy.

    • A bit of a call to action. Folks, many of you have slowly, almost silently, acquired valuable estates of digital content. Personal photographs, purchased music, financial records, serial keys to activate downloaded software, and so on. In some cases the content cannot be replaced if it becomes corrupted. Seriously, it’s time to implement a data protection and recovery plan in your personal computing world.
    What this thread is not…

     • A replacement for contemporaneous threads. As with any fluid situation, the most current information is likely to be available in current discussions being held here and elsewhere. What's often missing from those discussions, in part because this is not their objective, are the overarching common truths and general lessons developed over time. In other words, this thread takes stock of the global situation from a reflective perspective to summarize and organize those lessons.

     • You will not find the one true perfect security setup below. That’s like reaching a temperature of absolute zero – you can get close, but you can never get there. Absolute security doesn’t exist; rational risk management does. Although there are no guarantees of absolute protection against future threats, reasonable steps can be taken today to improve your situation, and those steps can be based on a rational analysis.

    • This is not a platform from which specific products will be recommended. A number of specific products will be listed, but the listing is not comprehensive and those listed are not recommended examples. Invariably, they will be decent choices among many equally deserving options. In general, coverage will center on tools/approaches that have reasonably wide coverage here or elsewhere. If you don’t see some of your favorite tools listed explicitly or by reference, don’t read too much into that detail.
    Major topics
     
  3. BlueZannetti

    If you are currently infected:
    • Get focused and dedicated assistance at one of the available sites on the Internet or at a local shop. Some specific sites/guides available online include:


    • It is important to appreciate that random efforts to fix a problem can create more issues than they solve. The first maxim should always be do no harm. The second maxim is that you should always be able to undo any action. Casual users may not be equipped to do this (example – having hardware/software to clone any HDD being assessed).

    • If you are unsure of how to proceed, that’s a strong signal to get assistance, and that’s best achieved via focused support.

    • If, understanding those cautions, you still wish to attempt to tackle the situation yourself, consider the following:

      • Don’t attempt to install new software to the compromised machine. While it often works, spectacular problems can occur as a result of software installations in an inherently unstable environment. Use approaches which do not require installation on an infected machine. These approaches include:

        • Use of a portable software application which runs as a standalone executable. Check for availability among your preferred tools.

        • Use an available online scanner. Many of the major AV vendors provide this facility including: BitDefender, Kaspersky Lab, Eset, F-Secure, McAfee, Trend Micro, Avast! and so on. For the evaluation of single files, the online multiscanners VirusTotal and Jotti’s malware scan are available.

        • Linux based bootable live CD. This is actually a fairly powerful approach in that one works from a clean OS. Options include DrWeb CureIt!, rescue CD images which can be created from within security products (one example would be that provided by Kaspersky Lab). If your security software provides for creation of a bootable live CD image that can be updated at run time, create that disc and verify that it functions properly now. Then store it for possible future use.

        • Use of a secondary Windows boot installation. Provide each machine with access to at least two different boot devices – either a second OS installed on a second physical drive, or an external USB/Firewire drive which allows booting to a known clean system. This capability is useful for any circumstance which compromises a hard drive (malware infection, physical failure, corruption of needed OS files, etc.) and allows recovery and/or manual cleaning operations to proceed. Depending on the situation, these installations can be created either by cloning or a direct OS installation to the media.

        • If a second physical hard drive is not an option, go about creating a bootable Windows installation via Bart's Preinstalled Environment (BartPE) bootable live windows CD/DVD or Ultimate Boot CD.

        • Remove the infected drive from the host machine and attach it as a slave drive to a known clean machine from which diagnosis and remedy can be achieved.

        • Boot to safe mode and work from a reduced functionality environment. Depending on the nature of the problem, this may or may not be productive.

      • Be prepared for any downside scenario. This often means copying valuable documents prior to initiating any work on the infected drive. Be aware that this attempt to safeguard information may serve as a mode of infection to other machines.

      • Try to secure access to a second machine with an Internet connection and hold it in reserve in the event your primary machine loses connectivity. Before the need arises, download a copy of Winsock XP Fix and check out Repair/Reset Winsock settings.

      • If you are unsure of yourself, you need to assess how you will validate that the problem has been fully resolved. This is one reason to employ expert assistance.

    • Let's repeat the initial suggestion for emphasis - get focused and dedicated assistance at one of the available sites on the Internet or at a local shop.
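
     If you do try the online multiscanner route for a single suspect file, a useful first step is to compute the file's cryptographic hash locally; sites of the VirusTotal type can generally be searched by hash without uploading anything (the hash-search capability is an assumption about your chosen site, so check its documentation). A minimal sketch:

```python
# Compute a file's SHA-256 digest in chunks so that even large files
# can be hashed without loading them entirely into memory.
import hashlib

def file_sha256(path, chunk_size=65536):
    """Return the hex SHA-256 digest of the file at `path`."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        # Read fixed-size chunks until EOF (b"" sentinel).
        for chunk in iter(lambda: f.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()
```

     The resulting hex string can then be pasted into the scanner's search box; if the file has been seen before, you get a verdict without the upload.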
     
  4. BlueZannetti

    Points you need to assess and consider

    Although the primary focus of the site revolves around matters generally captured by the umbrella of computer security, that means much more than simply dealing with malicious software. This is really about computer/personal/data security – the availability and fidelity of computing resources and digital information that you rely on.

    What you need to think about:

    • What have you done to ensure maximal uptime for your machine? While malware can impact system uptime, at some point in the future hardware failure will impact the usability of your computer. Have you planned for both eventualities? Regardless of how careful you are, one will befall you at some point and you need to be prepared beforehand.

    • The most significant computer issue that a user will face, and likely face multiple times, is physical failure of a hard drive in a system. In some instances the failure may be immediate and catastrophic, in other instances it may be over a timeframe which allows partial to complete data recovery from the compromised drive. Lack of a robust backup strategy is probably the single largest issue facing home PC owners today. We obsess over malware. However, malware is not where many users have experienced their most significant issues, it’s almost always been hard drive failure.

    • Moving back towards malware, what steps have you taken to minimize the possibility that your computing resources are being exploited for unauthorized ends? How would you even know if this is occurring? How would you discriminate a malware infection from a software conflict or some other problem not caused by malware?

    • How will you know whether a piece of software is malicious, or an essentially nonfunctional rogue application for which you’ve paid good money?

    • What active measures have you taken to protect your digital assets including: photos, purchased music, purchased download only software, personal/business documents?

    • How are you protecting yourself against identity theft/etc.?
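
     On the backup point above: a backup you have never verified is only a hope. One simple verification approach is to compare checksums of the source and backup copies. A minimal sketch, not tied to any particular backup product:

```python
# Walk a source tree and confirm every file has an identical copy in
# the backup tree, comparing SHA-256 checksums.
import hashlib
from pathlib import Path

def checksum(path):
    """Hex SHA-256 of a file's contents (fine for modest file sizes)."""
    return hashlib.sha256(Path(path).read_bytes()).hexdigest()

def verify_backup(source_dir, backup_dir):
    """Return the relative paths whose backup copy is missing or differs."""
    source_dir, backup_dir = Path(source_dir), Path(backup_dir)
    problems = []
    for src in source_dir.rglob("*"):
        if not src.is_file():
            continue
        rel = src.relative_to(source_dir)
        dst = backup_dir / rel
        if not dst.is_file() or checksum(src) != checksum(dst):
            problems.append(str(rel))
    return problems
```

     An empty result means every source file has a byte-identical backup copy; anything listed needs to be re-copied before you rely on that backup.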
    Some general design objectives to consider:

    • Appreciate the level of user interaction that is required for any solution and how this may play out in actual use. The need for continual user interaction and/or decisions made on the fly really should be minimized IMHO. Achieve your goals via intrinsic configuration and/or background automation where possible.

    • Maintain simplicity. Complex systems interacting in complex ways yield complex, unpredictable, and often unfortunate outcomes.

    • Stick with an approach long enough to understand how it behaves. Constant churn in a system configuration is a net negative since "normal" operating behavior becomes a moving perception.

    • When changes are made, change one variable at a time. It helps in troubleshooting if problems emerge later on.

    • Understand what you want to control and why it’s important.

    • Deal with reality, not hypothetical conjecture. When you are chasing solutions, verify that the solution is for a problem that actually exists.

    • However, be aware of changing realities. Yesterday’s hypothetical conjecture could be today’s reality. This is one reason that a single set answer to securing a system doesn’t exist – that answer is fluid with respect to the passage of time. General guidance is more enduring.

    • Finally, assume the worst outcome imaginable. What's your personal restoration/recovery plan? You do need one, and it really needs to be a solution in which you can start with a completely bare drive. Do you have what's required to accomplish this (i.e. install CD's, downloaded installation files, necessary drivers, activation serial codes or key files, etc.)? It is best to accumulate this information before a problem occurs. For example, one could store all electronic downloads on a dedicated drive. If key files are needed, copies can be maintained with the downloaded files. Serial key codes can be maintained in a master Excel spreadsheet (or other electronic file) for ready access. In some respects, a flat text file may be preferred. This information can be backed up as well.
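
     The flat-file inventory suggested above can be as simple as a CSV with one row per purchased download. A sketch (the field names are illustrative assumptions, not any standard format):

```python
# Maintain a plain-text (CSV) inventory of purchased software: product,
# version, serial key, and where the installer is stored.
import csv

FIELDS = ["product", "version", "serial_key", "installer_path"]

def save_inventory(rows, path):
    """Write the inventory (a list of dicts keyed by FIELDS) to a CSV file."""
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=FIELDS)
        writer.writeheader()
        writer.writerows(rows)

def load_inventory(path):
    """Read the inventory back as a list of dicts."""
    with open(path, newline="") as f:
        return list(csv.DictReader(f))
```

     Because it is plain text, the file remains readable from any rescue environment (a Linux live CD included), which is exactly what you want in a bare-drive recovery scenario. Back it up with everything else.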
     
  5. BlueZannetti

    Approaches to maintaining a clean system

    Computer security is about control, control of the following….

    • Operating fidelity of the computer to provide maximal system uptime. You want your system to work, work as intended, and work now.

    • Inbound/outbound communications. Your system should serve as your communication tool, not some other individual's platform. That means having some measure of control with respect to communications. In general, that means blocking unsolicited inbound, preventing undesired outbound communications, and maintaining encryption of wireless communications.

    • The privacy and security of personal information/credentials used to manage your affairs

    • Overall system performance
    There are a multitude of paths to nominally equal states of security. It is important to understand that while the paths may be very different, the realized goals may be virtually identical. The different approaches taken often reflect balancing different traits, relying on different levels of expertise, or favoring distinct design aesthetics.

    Invariably, to maintain system fidelity, you need to start with a known good state and keep it. This can be implemented system wide or in discrete areas of the file system. It can be a permanent change or invoked as desired. Examples include:

    • Native configuration/features

      • Objectively speaking, the default configuration of Linux and Mac OS-X workstations is suitable for out of the box use as is. The combination of package management, running as a non-admin user, and a relatively small user footprint combine to make this an unproductive target. However, it should be recognized that even if this population were a large and valuable target, the base configuration of these systems is quite strong as is. Whether or not a blacklisting antimalware (AV/AS/etc.) is needed on Linux/OS-X systems has been discussed in many venues, including here (see Antivirus Necessary For Linux ? or Antivirus for Linux). The simple answer is no – however, that is not a statement that these platforms are immune to security issues, just that the AV component is currently not a needed facet (a couple of qualifiers - this assumes the machine is a simple end-user PC, not used as a file server for Windows PC's, and not used as any sort of Windows gateway - in which instance scanning for Windows based malware may be a desirable end goal).

      • Windows configured as a limited user. Using SuRun with XP or maintaining UAC active with Vista and Win7 yields a good generalized solution. WinXP limited user augmented with SuRun is basically how limited users should have been implemented. Most issues involving badly configured legacy programs can be dealt with, but a user really shouldn’t be forced to do this. Key links: SuRun: Easily running Windows XP as a limited user; Maximizing Windows XP security with LUA and SRP ; SuRun - SUDO in Windows – Tutorial; SuRun site; Maximizing Windows VISTA security with LUA and SRP (even without ultimate); For Users of Software Restriction Policies

      • Software Restriction Policies (SRP) are an excellent extension along the lines of using the native capabilities of the OS. The main issue is that SRP’s can require a higher level of user intervention and expertise. However, there are guides available to assist users of almost any level. For example, see Ruin a malware author's whole day with a Software Restriction Policy! : ) by mechBgon or PGS - Pretty Good Security by Sully. Either link takes the pain out of SRP.

      • Strengths/Weaknesses: The primary strength of this category of approaches is that they take advantage of native capabilities of the OS. Arguably, the default configuration of Windows should go this route, and while that has partially happened with Vista and Win7, the “everyone’s an admin” boat left the dock a long time ago and that precedent is difficult to undo. Furthermore, sloppy coding practices have helped solidify the notion that casual users can (should?) run with Admin level credentials. The simple fact of the matter is that routinely running as an Admin level user is insane. The primary weakness (in Win XP) is that some level of user involvement, and potentially inconvenience, is required to implement limited user accounts and so on. By the same token, this is the default implementation in the Linux/Mac OS-X world. Eventually, this should also be the case in a Windows environment, unfortunately it appears that this will not occur until Windows XP has passed into disuse.
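
     The core idea behind a path-rule SRP can be reduced to a default-deny check: execution is permitted only from a short list of trusted directories that a limited user cannot write to. A conceptual sketch (the two allowed paths are the stock Windows defaults, shown purely for illustration; real SRP policies also support hash and certificate rules):

```python
# Model a path-based default-deny execution policy: an executable may
# run only if it sits under one of the allowed directory trees.
from pathlib import PureWindowsPath

ALLOWED_DIRS = [r"C:\Windows", r"C:\Program Files"]

def execution_allowed(exe_path, allowed_dirs=ALLOWED_DIRS):
    """Default deny: True only if exe_path is under an allowed directory."""
    exe = PureWindowsPath(exe_path)
    for root in allowed_dirs:
        try:
            # relative_to() raises ValueError when exe is outside root.
            exe.relative_to(PureWindowsPath(root))
            return True
        except ValueError:
            continue
    return False
```

     Anything launched from a user-writable location (the Temp folder, the Desktop, a USB stick) fails the check, which is precisely what defeats the typical drive-by download.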

    • Execution control/Monitoring

      • Static execution control: This category fixes the configuration and set of applications that are allowed to execute. It is a pure default deny approach. As such, it is extremely robust. No decisions are taken on the fly. Commercially, the applications AntiExecutable by Faronics or Executable Lockdown from Horizon DataSys are examples of this genre. This is a powerful approach, but can be maddening in situations in which the installed application base changes on a regular basis. For systems in which a very static software configuration is the norm (public access PC’s in shops/academic settings, institutional computers), this approach excels; it is highly recommended and extremely straightforward to use.

      • Dynamic execution control: This category of application allows you to render on the fly decisions with respect to application execution as well as the scope of the process. It is an exceptionally powerful scheme, but with power comes responsibility. To varying levels, providers of this type of application attempt to assist the user in rendering a decision. The structural problem with this type of application is the level of user knowledge and intervention often required. In some instances it is possible to reasonably quickly navigate this need (e.g. a session dedicated to the purposeful launch of the installed applications and configuration of the package). Examples of this category of application would include offerings such as Defensewall, GeSWall, and a host of other applications, some current, some abandoned. While powerful, it remains a niche type solution. However, if the application base on the machine changes on a frequent basis, this type of solution is much more suitable than the static execution approach described above.

      • Kareldjag's Security Overflow site contains a number of evaluations of this genre. See in particular An Overview of Personal Desktop/Host IPS, An Overview of Personal Desktop/Host IPS 2, and Host Intrusion and Prevention Tests. While much of the content was written a few years ago, the information remains useful today.

      • Strengths/Weaknesses: It is hard to argue with the concept of default deny. It is extremely robust, time tested, and transparent in implementation. Static execution control configurations are default deny, while the dynamic approaches are deny unless you change your mind on the fly. Both approaches have merit and it’s a trade-off between tightness of security and convenience in use. That’s a trade-off that weaves through the whole of PC security. On a truly static system, there is probably no more robust solution than static execution control.
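
     The static/dynamic distinction drawn above can be sketched in a few lines: a static controller consults a fixed whitelist and nothing else, while a dynamic controller falls back to an on-the-fly user decision (modelled here as a callback). This is purely conceptual; no actual product works exactly this way.

```python
# Two toy execution controllers illustrating default deny in its
# static and dynamic forms.

def static_allow(program, whitelist):
    """Pure default deny: only pre-approved programs ever run."""
    return program in whitelist

def dynamic_allow(program, whitelist, ask_user):
    """Default deny unless the user approves on the fly; approvals
    are remembered (added to the whitelist) for later launches."""
    if program in whitelist:
        return True
    if ask_user(program):   # the on-the-fly decision the text describes
        whitelist.add(program)
        return True
    return False
```

     The trade-off in the text falls directly out of the model: `static_allow` never interrupts you but cannot accommodate new software, while `dynamic_allow` accommodates anything at the cost of shifting the expert judgment onto the user at the prompt.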

    • Virtualization

      • Think of virtualization as “security” through the use of a transient and completely disposable computing environment. This isn’t security in the usual sense of the word, although there are very clear security implications to having “disposable” computing environments. Virtualization limits the scope of any malware incursion to a defined session and reduces system recovery to a restart level event, which for any user is a potentially enormous benefit. The scope of a virtualized environment runs the gamut from application level to partition based to a complete virtual system. As one might suspect, the required computing power increases with the scope of virtualized environment, but most current (2009) machines with suitable RAM are able to readily run any level of virtualization.

      • Application level virtualization or sandboxing: A clever, resource light approach to maintaining system fidelity is to allow applications to run in isolation from the remainder of the system, their own sandbox so to speak (hence the terminology). Sandboxie is the most recognized and probably the most popular variant of this genre. Some antimalware suites have implemented limited versions of sandboxing (e.g. Kaspersky Lab KIS 2010).

      • Light/partition virtualization – The next step in this category involves virtualization of a complete partition. Typically, the default is for the OS partition to be virtualized, but some applications allow virtualization of other partitions as well. A couple of general threads are: Light virtualization: Returnil/PowerShadow/ShadowDefender/ShadowUser Pro and Light Virtualization - the first year..... These applications are fairly mature, but started to receive wider usage with the advent of being able to enter into virtualization without necessitating a system restart. While a seemingly minor development, this removed a major user inconvenience. Exit from a virtualized session still requires a system restart.

      • Full machine virtualization: Complete machine virtualization tends to be employed by users seeking to create and use disposable environments for software and other testing purposes. In general, CPU and RAM demands are somewhat high, but well within the reach of upper middle to high level consumer PC’s. A group of paid and free full system virtualization products (VirtualBox, VMware Workstation, VMware Server, VMware Player, Microsoft Virtual PC 2007) are readily available and have been discussed at length in a number of threads and external resources ( VirtualBox, Virtual Machines, Virtualization (one section of Mrkvonic’s informative site), VirtualBox 3.0.4)

      • Strengths/weaknesses: For most users, the most difficult and time consuming phase of dealing with a malware infection is the recovery phase. Virtualization schemes render this step fast and trivial. Weaknesses reside in how the user manages the virtualized state. Typically, full machine and partition virtualization rely on a user decision to enter a virtualized session. It's possible to do this as a matter of course, but that's actually somewhat unusual. Application level virtualization has somewhat more flexibility here in that Internet facing applications for which downloaded content is transient can be readily configured to always launch in a sandboxed environment. In all cases, users can employ a variety of approaches to continually work from a virtualized system and selectively retain information as desired. However, that starts a cascade of user based decisions, made on the fly, and this basically subverts a part of the basic reason for virtualization (facile session disposal/reset).

    • Rapid recovery/partition restoration

      • These applications are focused solely on maintaining maximum system uptime via the ability to quickly restore partitions to a system using a prior saved state. There’s some similarity between this class of applications, virtualization, and classical backup strategies with respect to the end result. However, in terms of the operating mechanism, user flexibility, and speed with which these programs can function, they are clearly a distinct class.

      • The gold standard is FirstDefense-ISR from Leapfrog Software, which is currently marketed by Horizon DataSys as FD-ISR Rescue, although an end of life statement seems certain in the near future (see Is First Defense dead?)

      • Horizon DataSys is also the vendor of RollbackRx (this program is also available under the EAZ Fix moniker in some locales – Horizon DataSys and EAZ Solutions are said to be ”sister” companies). RollbackRx works in a completely different fashion (via sector level redirection managing what’s effectively a phantom file system on the drive). The advantage of this scheme is exceptional speed to restoration (seconds vs. 10’s of minutes by other approaches). The downside is that disk manipulations performed outside the RollbackRx environment will corrupt the sector map and render the drive unusable. This detail needs to be appreciated by users of the product and it does limit offline manipulation of the system.
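
     The sector-redirection idea behind that class of product can be reduced to a toy model: after a snapshot, writes land in a redirect map rather than in the protected sectors, reads consult the map first, and rollback simply discards the map. This is a conceptual sketch only, not a claim about how RollbackRx is implemented internally:

```python
# Toy model of snapshot-and-redirect disk protection: the base sectors
# are never modified after the snapshot, so restoration is near-instant.

class RedirectingDisk:
    def __init__(self, sectors):
        self.base = list(sectors)   # the protected snapshot state
        self.redirect = {}          # sector index -> post-snapshot content

    def write(self, index, data):
        # Post-snapshot writes go to the redirect map, not the base.
        self.redirect[index] = data

    def read(self, index):
        # Reads see the redirected copy if one exists, else the base.
        return self.redirect.get(index, self.base[index])

    def rollback(self):
        # "Restoration" is just dropping the map - seconds, not minutes.
        self.redirect.clear()
```

     The model also makes the product's documented weakness obvious: any tool that writes to the disk without going through the redirect layer (an offline partition editor, for instance) leaves the map inconsistent with the real sectors.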

    • Live CD Linux

      • A Linux Live CD is a complete environment that can be booted from CD at any time and does not engage in any modification of the system. It is the classic example of a disposable computing session and provides a completely static and disposable environment.

      • Numerous offerings are available. For additional information, see the *nix resources: a list... thread in the all things UNIX subforum.

    • Classical blacklist AV/AS type solutions

      • This approach attempts to maintain a clean system by identifying known (or suspected) malware as it appears on a machine and removing it.

      • As a largely reactive approach, it is always playing catch-up. Given the explosive growth in malware, this eventually runs into a scalability problem of the response time being unable to keep up with the onslaught. That end-game has not appeared as yet, and it’s still somewhat off in the future, but it lurks as an almost statistical certainty.

      • Vendors in this segment have expanded their toolbox to include some of the other approaches mentioned above, and this provides appreciable benefit. Suites, which combine a number of blacklisting approaches with an integrated firewall and, more recently, features such as application control, allow consumers to implement multiple approaches under the guise of a single package.

      • The death of AV’s has been proclaimed for years – see from 2007, 2006, 2002, and there are likely earlier examples. There’s a trend here, and it still seems a premature burial.

      • Although malware is a global issue and the spread of malware is certainly aided by the connectivity of the Internet, geographic and language localization can matter. For this reason, don't dismiss smaller local offerings that are developed for and concentrate on supporting local markets. Specific examples would be offerings in areas where English is not the dominant language (e.g. China, Eastern Europe, etc.). Vendors based in these areas may not possess the brand name recognition of the larger global vendors, but they can be quite suitable in the local region since their offering will generally be focused on locally active threats.

      • The purpose of an AV is to let you know when it sees a file that’s malicious. As a blacklist, it does not validate as good, it identifies as bad. However, absent an expert file identification system such as an AV or related product, how can a regular user identify malware?

        • Code analysis of the disassembled file? Ummm…, right…
        • Running the file in question and hoping for the best? Chancy in most situations.
        • Rely on download source? Actually not bad (compare with package management under Linux), but not always obvious.
        • What are we left with then..., there really isn’t a good mechanism for regular users to assess whether or not a specific file is malware save for a blacklist type approach, even with all its negative attributes. It’s here to stay in some form as long as uncontrolled content can be downloaded from the Internet.
        • Finally, if you need to ask someone else "Do I need an AV?", the default answer is most assuredly yes. When you're comfortable with your own counsel in this area, you may be ready. I'll leave open the possibility of foolhardy, but if you've seriously assessed the relevant material, you should be equipped to deal with most eventualities if you choose to go that route (by the way - it's not a path I recommend on Windows based machines).

      • Which to choose? Here are some Wikipedia and other based listings of some options in the various program categories:


        Almost all of the major vendor products will yield operationally equivalent behavior in most users' hands. They all cover the widely circulating examples of malware to which you're most likely to fall victim. If you happen to expect malware challenges on a frequent basis (note – realistically, frequent means more than a few times per year), perhaps a best-in-class product as indicated in head-to-head testing (see challenge tests by AV-Comparatives.org or AV-Test.org) should be your product choice if you employ this approach. Note that the various listings mentioned need to be constantly refreshed to maintain current information. For that reason, consult active forums (e.g. Other Anti-virus Software for the most current information, and the AV Screenshots Thread for a peek at what many of the offerings look like in use).
    It should be apparent that maintaining system fidelity can be achieved by many paths, some very divergent in approach, others with multiple overlapping traits. The key thing is that all these schemes work, some work best in specific situations, some can be readily combined, and others are combined at the users' risk.

    It's also true that one can dispense with any of these solutions altogether, and experience no ill consequences. However, simply because you can successfully do something does not mean that it's prudent to do so. One example hierarchy of solutions could be as follows:

    • Start running under a limited user account (XP) or keep UAC enabled (Vista/Win 7). Under XP, install SuRun capabilities. Note, some software still does not work well within a limited user account. SuRun can generally resolve this compatibility issue in a facile fashion. If LUA/UAC are so great, why aren't they used universally? There are likely many reasons and rationales, but three major ones are....

      • With Windows XP, LUA is not an enforced default. If you want to direct someone to do something, force them to make an active decision to not do it, in other words make it the default.
      • Sloppy coding that "requires" administrative level privileges when they're not really needed.
      • Hubris. That may appear to be a harsh assessment, but it seems borne out in many LUA/Admin level usage discussions held in the past.

    • Use an AV. Implementation details (real-time monitoring vs. demand only scanning vs. web-based online scanner or web multiscanner) are left to the user. There are a range of configurations one can use, spanning from continuous real-time monitoring to simple demand only evaluation of specific files. In all cases the role of the AV is the same - it acts as your personal expert system to evaluate whether a file has malicious traits or content.

    • Consider adding a virtualization solution if you frequent seedier elements of the web. For casual usage, application level/sandboxing or partition level virtualization is recommended. For software testing, go with full machine level virtualization.

    • If you like to play with malware (self education, etc.) or desire to exert complete control over a machine, employ some type of dynamic execution control measures and employ a firewall that easily allows very granular control. If you go this route, in general you've decided to replace an AV's expert evaluation with your own in response to alerts regarding execution control.

    • Finally, although anyone could argue that LUA + AV + virtualization + HIPS + firewall used in combination could yield a really secure machine (and yes, it would in principle), don't go down the road of implementing an excessive patchwork security construct. There are too many downside possibilities.
     
  6. BlueZannetti

    BlueZannetti Registered Member

    Joined:
    Oct 19, 2003
    Posts:
    6,590
    Communications

    A large part of security clearly involves developing a sense of the inbound and outbound communications that your computer is engaged in. Malware is often used to acquire information, computing resources, and so on for untoward purposes. Each of those objectives requires establishing communication with the outside world.

    It's been noted in many places, but will be repeated here for emphasis: if you are dealing with the control of unwanted/undesired communications, your machine is already compromised. You are not eliminating the compromise by controlling the communications; you are mitigating the range of consequences. That is a very beneficial end result, but the scope of the result needs to be fully appreciated. Control of communications provides containment, not necessarily remedy, of malware.

    There are a number of specific approaches that can be used to monitor and control communications. One detail to keep in mind - if control is exerted off the computer, you are dealing with packets/ports/destinations and perhaps glimpses of information. Control exerted from on the computer can draw on a much higher level of knowledge regarding the communication. For example, the specific application participating in the communication will be known.

    With regards to specific approaches:

    • Router: A router is not a security device, but its functional behavior has clear security implications. In very basic terms, a router will automatically reject unsolicited inbound communications to your PC. It will not reject solicited, but malicious, communications. The benefits and robustness of the operational hardware firewalling provided by a router are covered well in discussions such as El Cheapo Router Challenge and First winner - El Cheapo Router Challenge. Every home broadband user should employ a router. An excellent site to visit for coverage of router (and other) hardware is SmallNetBuilder.com. In particular, visit the Wireless Performance Charts section if you're actively shopping for a device.

    • Software firewall (client): For the majority of average users, use of the Windows based firewall or the firewall component of a security suite is generally more than enough. Specialized/dedicated firewalls generally afford much more latitude with respect to the granularity with which communications can be managed and controlled via the development of detailed rules by application, port, or protocol, among other possible variables. The primary difference between the communication control provided by a router and a full featured firewall is as follows:

      • A router is a separate dedicated hardware device. The computational load associated with it has no impact on PC resource load. It is therefore effectively a means of resource load balancing.
      • A router deals with packets and packets only. It has no direct knowledge of the application generating the communications. Rules to control communications will therefore tend to be time (day/hour/net amount) and protocol (TCP/IP, ARP, by port number, etc.) based.
      • If you really wish to actively control communications between your PC and the outside world, a software firewall is a must.

      Users contemplating use of a software firewall should consult some of the following resources:


    • Hybrid solutions: Instead of a dedicated hardware router, if you’re in possession of an older PC, this can be converted into a somewhat more powerful and flexible router solution using readily available packages such as Smoothwall, m0n0wall, pfsense, IPCop, or Endian. These and related products can be used to turn legacy hardware into flexible firewall/routing solutions.

    • Wireless communications: With the increasing prevalence of laptops, netbooks, and other small devices enabled with wireless communication, the management of wireless access is a factor that needs to be addressed at some point. Although Bruce Schneier has written about My Open Wireless Network (see also Terrorists Using Open Wireless Networks as counterpoint by Schneier as well), the simple fact of the matter is that personal wireless networks should be closed access with a decent level of encryption enabled. As a base level of security, the ability to remotely administer a wireless router should be disabled as a matter of course (this is the typical default) and WPA2 encryption should be employed. For the generation of various keys required for encryption, a number of convenient sources are available on-line ( SpeedGuide.net WLAN Key Generator or WEP/WPA key Generator from Soroban Systems are representative examples).

    • Are any of these solutions needed?

      • At the very least, any home with computers on the internet should employ a consumer level router. They are cheap, effective, remove computational load from the user’s PC, and are robust.
      • If wireless connections are used in the home, the link should be encrypted with WPA2 level encryption.
      • If the user feels that active control of communications on a process/application basis is required (and this could be as simple as denying internet access to all but a select set of applications on a per application basis), then a software firewall should be used. Note, this is the level of security at which user intervention has significantly increased. The prior two items are generally rapidly implemented and left to function. The step to implementing a coherent software firewall is much larger in most instances.
      • If spare general purpose PC’s are available and unused, tasking one as a dedicated router/firewall may be beneficial from both security and educational perspectives.
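    On the wireless point above, the key-generation step can also be handled locally rather than through an online generator. A minimal Python sketch (the function name and 24-character default are assumptions for the example; WPA2 accepts passphrases of 8-63 printable ASCII characters):

```python
import secrets
import string

def wpa2_passphrase(length: int = 24) -> str:
    """Generate a random WPA2 passphrase (WPA2 allows 8-63 ASCII characters)."""
    if not 8 <= length <= 63:
        raise ValueError("WPA2 passphrases must be 8-63 characters long")
    # secrets draws from the OS CSPRNG, unlike the random module
    alphabet = string.ascii_letters + string.digits
    return "".join(secrets.choice(alphabet) for _ in range(length))
```

    Generated this way, the key never leaves your machine, which is arguably preferable to trusting a third-party web page with it.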
     
  7. BlueZannetti

    Security of personal information.

    The prior discussion has focused on what might be considered the digital security of a system and, by extension, the information stored on it. Here we consider the actual physical integrity of the system and how users should approach that topic. It is often thought of as maintaining backups of data, but it is really much more than that. The objective is to put into place an infrastructure that assures maximal system uptime and maximal integrity of any final or in-process work being performed on your systems. There are a number of discrete dimensions to be considered, but a scenario in which your primary system drive exhibits unpredicted and catastrophic physical failure captures the key elements that one needs to consider. If this occurs, one needs to contemplate the following:

    • Let’s be clear from the start, unlike the discussion on malware infection which may never occur, physical failure of a hard drive is only a matter of time. It will happen to you. It may occur within the first year, 5 years down the road, or perhaps after 10 years. However, it will happen. It is only a matter of time. Too many users fail to fully appreciate the consequences of this simple reality.

    • If a physical device fails catastrophically, you need to assume that any information contained on it is lost. Subdividing a large drive into multiple partitions won’t help you here. Information needs to be retained on a separate physical device. With the current cost of hard drive media, and the potential value of digital assets, there should be no question regarding the use of separate physical devices for backup of files.

    • Hidden restore partitions that vendors like to use as a simple cost and support savings measure fail in this scenario. Do you have an alternate path to recreating a complete base OS installation of your system on bare metal? If not, you need to plan for this eventuality.

    • In many instances, you not only need the means to restore full functionality to your machine (means that many users possess only in principle), you need to be able to do so quickly. A scenario that is useful to consider is the poor university student facing a deadline on a major project when disaster strikes. Depending on the circumstances, that deadline may not be movable. What steps could a student preemptively take to best manage through this possibility? Disaster could be a drive failure, loss of a laptop, or theft of a laptop. In all three cases, needed information and work product is gone and needs to be rapidly replaced.

    • A simple reality that we all need to embrace is that the world, and our assets in this world, is increasingly digital in nature. Assets such as:

      • Personal photographs and documents that have mainly sentimental value – which are not replaceable if lost.
      • Purchased electronic media – music files, software downloads with serial keys and/or key files.
      • Contact databases and email
      • Important financial and tax records

      The listing of these categories is to make a point – there’s a lot of material which, if lost, could result in extreme levels of inconvenience to outright financial loss. This material needs to be protected against loss.
    Maintaining a viable backup strategy.

    This is the point at which decisions need to be taken. What should I do? To coherently answer that question, you need to assess a few details:

    • Am I trying to preserve a system with static content? This could be, for example, a PC in a public café, a simple walk-up access machine, and so on. You need it to work but no user documents are meant to be stored on this machine. In this case, all one needs to do is maintain either (a) the needed OS/application reinstall discs and information (activation codes, etc.) or (b) a physical clone of the full operating partition. As purely a matter of speed, option (b) is probably preferred since the timeline from total failure to total resurrection is simply the time to reclone the HDD.

    • If the content of a machine is dynamic, what’s the typical timescale of changes? If it’s infrequent, a simple clone or backup plus scheduled incremental backup strategy may suffice.

    • An observation - it seems a truism that security/backup solutions that rely on continual active user interaction/initiation to function are prone to eventual (user based) failure. In both security and backup, one is managing the risks associated with a low frequency event. Performing daily backups for 4 years gets old, so you stop, only to be subject to failure in year 5. One needs to eliminate the “it gets old/boring/whatever” eventuality from happening. This is the type of circumstance in which automation, coupled with notification in the event of exceptions, does seem the best (or at least a prudent) course of action.

    • Solutions. Potential solutions are as numerous as the stars. Some are better, some worse. Examples include:

      • With an Apple Mac, get a Time Capsule and use the OS-X based Time Machine. This approach, coupled with creation of an external drive bootable clone (using Carbon Copy Cloner), provides constant, unattended, system backup with immediate access to a secondary bootable image. The time from full HDD failure to being operational is simply the time to restart over a Firewire connection.

      • For a small (< 10 machines) home network, a Windows Home Server system provides an excellent solution with decent price scalability for even a handful of systems. Options range from an older system recast as a home server running a copy of Microsoft Windows Home Server or a dedicated headless system (example – HP MediaSmart server). For ease of use, a preconfigured system is ideal. Whether this makes sense depends on a few factors including price point sensitivity.

      • As the number of machines needing backup decreases, the most cost-effective solutions change from networked server, to simple network attached storage, to a single machine dedicated internal or external physical drive. Each of these approaches works at any scale; it's simply a matter of understanding the cost and management trade-offs as the scale of the operation changes. Despite the power of network based options, a machine dedicated approach is generally best for casual home users, and that generally reduces the issue to acquiring appropriate software, a hard drive, and configuring the system to achieve the desired results. Specific details to consider:

        • Internal vs. external drives is a fairly neutral decision involving trade-offs of convenience, flexibility, and speed. Choose one.

        • Have a firm recovery scheme in place and verification that it will work.

        • Assume complete loss of the primary system HDD. Do you have a boot recovery option? If the vendor provides for creation of a bootable rescue CD, have you made one and verified that it works as needed?

        • Do you have a verified image that can be used to reconstruct your system quickly? How do you know it will work?

        • Do you have a location in which all key information needed to resurrect your system is maintained? For example, do you maintain all software serial keys in a single file (an Excel sheet, for example) so that reinstallation of any software – old or new – is a facile event? If an application uses a vendor-provided key file, is a copy retained in a separate download folder specific to that vendor, along with previous versions of that application you've downloaded? All versions should be kept on hand as a guard against the emergence of future incompatibilities.

        Understand that assumptions made prior to a catastrophic event may not be addressable after the event (HDD failure) has occurred. Understand the initial assumptions made, question them, and adjust as needed. Finally, do you have a plan B? There are simply too many example threads on this site in which, in the heat of anxiety and confusion, a user has experienced a problem and has ended up blowing away their recovery option in their efforts to repair the system.

    • Software Options:
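    Whatever software option is chosen, the automation point raised earlier is the part that matters most. As an illustration only (not a replacement for dedicated backup software - there is no verification pass, no retention policy, and no handling of deleted files), the core of a scheduled incremental copy can be sketched in a few lines of Python; the function name and layout are assumptions for the example:

```python
import shutil
from pathlib import Path

def incremental_backup(source: Path, dest: Path) -> list[Path]:
    """Copy files from source that are new, or newer than the copy in dest.

    A bare-bones sketch meant to run unattended from a scheduler.
    Returns the list of destination files that were (re)copied.
    """
    copied = []
    for src_file in source.rglob("*"):
        if not src_file.is_file():
            continue
        target = dest / src_file.relative_to(source)
        if not target.exists() or src_file.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src_file, target)  # copy2 preserves timestamps
            copied.append(target)
    return copied
```

    Hooked to the Windows Task Scheduler (or cron), something of this shape removes the "it gets old/boring" failure mode; the returned list is the natural place to attach exception notification.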

     
  8. BlueZannetti

    Balancing Security, Performance, and Usability

    One topic that gets too little attention is the tension between security and a number of other areas such as user convenience and system performance. Achieving a realistic level of security involves an active balancing of trade-offs. There is really no way around that equation. In many cases, the discussion here focuses exclusively on the potential security benefits which are gained. It is less common to see an equally developed discussion of the inconveniences or performance downgrades possible. Some specific issues to consider in weighing this balance…
    • Read some of Bruce Schneier's writing at his site. Some pieces that are particularly recommended include:


      Schneier's site is highly recommended to not only understand some of the technical issues confronting security as a general topic, but some of the underlying reasons that we seem to behave as we do, and how understanding that can help.

    • How frequently does the software environment on your machine change? This could involve anything from serial beta testing of anything that appears on the Internet to a machine which has not seen a software or OS update in ages. One needs to appreciate that these extreme cases can employ fundamentally different strategies to achieving reasonable levels of security.

    • Users need to get past the ethic in which they feel compelled to respond to each and every mention or rumor of a threat on the Internet. It is important to develop a sense for the difference between pure noise and genuine shifts in needs. The hardest part of this balance is developing an appreciation of whether or not that latest increase in activity is a large noise spike or the emergence of today's zero-day threat. There are no absolutes here; the simple fact of the matter is that you probably won't know in advance. That's a reality and a fact you need to plan for by, for example, implementing a robust backup plan. When people speak of layering, it should be in this guise - genuinely complementary solutions.

    • Since users can assemble a multitude of solutions from the partial approaches mentioned above, it's useful to mention schemes that are generally not recommended. This would include:

      • Installation of multiple real-time AV/antimalware monitors is decidedly not recommended. Too many problems are possible. Even if all but one AV is set for demand scan only, there are too many issues lurking in the background. Don't do it. The return is meager and the downside risk is high.

      • Users really shouldn't run in an Admin-level account for routine usage. This is tempered by the fact that some legacy programs needlessly require Admin-level access.

      • You should refrain from assembling a security solution as a serial patchwork of products that are slapped one on top of another to plug a series of perceived defensive holes. Look at the forest, don't simply keep planting new trees without a global plan guiding the effort.

      • Try not to obsess over inconsequential differences. Naturally, to get to this point you need to be able to determine what's consequential. Is that AV which detects 99.9% of the malware it was challenged with really better than the one detecting 99.5%? 95%? 90%? It's rather easy to make the case that differences that seem to cause much anxiety here are actually quite inconsequential.

      • Users shouldn't rely on approaches that are not understood. One doesn't need to know all the intimate details of functionality, but one does need to understand what the result means. For example, a classical AV will flag a file as malware or suspicious. The user really doesn't need to understand all the computational detail behind that determination; they simply need to appreciate that a determination has been made, a thumbs down has been given, and an action needs to be taken (if offered). Contrast this to a situation in which an alert regarding a potentially malicious action (say altering a registry value) is being initiated. For a user to intelligently deal with that alert, the level of system knowledge required is actually fairly high. An advanced user will generally be equipped to deal with the situation, a casual user won't.

      • Users shouldn't pursue a strategy in which each and every hypothetical floating out there is comprehensively addressed. There is a point of diminishing returns that is quickly reached. The primary issue is that while hypothetical attacks are seemingly addressed, the complementary needs to facilitate a recovery operation (e.g. system backup) remain woefully neglected. Too few seem to approach both ends of the problem with equal force.

      • Finally, while the primary focus here is data retention, privacy concerns tend to focus on data removal. Since the topic keeps reappearing on a regular basis, let's take a moment to note that Secure deletion: a single overwrite will do it. No, you really don't need to overwrite files 35 times to eliminate them. Once is enough. However, be aware of persistent metadata, MFT entries, and cached information that leave residual ghosts of the information you've casually presumed that 35-pass overwrite has eliminated. If there's a privacy problem, it almost certainly resides here and not in how many times you've overwritten a given file.

      • By the same token, facile recovery should not be confused with security. An insecure system that can be disposed of and replaced via a simple restart remains insecure. To be sure, the scope of potential damage is quite limited, but this limit in the scope for potential damage should not be mistaken for security.
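    The single-overwrite point can be illustrated with a short Python sketch. This is a hypothetical helper, not a substitute for a proper wiping tool, and note that it does nothing about the MFT entries, metadata, and caches flagged above as the real residual risk:

```python
import os
import secrets

def overwrite_and_delete(path: str) -> None:
    """Overwrite a file once with random bytes, then delete it.

    One pass is enough on modern drives. Copies in the MFT, caches,
    or shadow copies are untouched by this and remain the real risk.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        f.write(secrets.token_bytes(size))
        f.flush()
        os.fsync(f.fileno())  # push the overwrite to the device
    os.remove(path)
```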
     
  9. BlueZannetti

    Common hardware/software configuration issues

    One primary symptom that users rely on in assessing whether or not a system is infected with malware is overall performance of a PC. That performance assessment may reflect periodic stalls, system crashes, and so on. While many of these symptoms can and do reflect compromise of a system by malicious software, simple hardware and software issues can lead to the same type of observations. When faced with a balky machine, some specific items to consider that can yield an unstable system include:

    • If the CPU has a fan, is the fan working properly? Overheating of the CPU generally yields system crashes and/or pure lock ups. Usually the system will behave fine from a cold start, fail sometime later, and experience variable results on attempted restarts. The problem may appear intermittent, but will often be tied to the computational load being placed on the system as well as ambient temperature and/or season of the year. To diagnose: run the system while the case is open to verify that the fan actually works. As a moving part, a fan is one of the components with a high expected rate of eventual failure.

    • If fans appear to be working, are they able to adequately deliver airflow to the needed areas? It is important to understand that the airflow through a computer case over a number of years can deposit a large amount of dust/lint onto a system, providing thermal insulation to covered components or disrupting airflow through case openings. If the inner workings of your PC are encased in dust, carefully remove that dust. Generally, a vacuum with a fine nozzle attachment held away from the components provides more than enough suction to do the job. Pressurized air/gas will also do the job, but generally creates a minor dust cloud and a mess outside the case in the process.

    • Is a CD/DVD drive not working? Check that the read/write laser is working and that the lens system is not scratched or covered with dust. A lens cleaning CD will often help with dust issues. Laser issues are often apparent by verifying that a disc is readable on all CD/DVD drives except the questionable one. The only reasonable course of action here is replacement of the drive.

    • Verify that the system has sufficient RAM for the task load that is run. Compare the total and peak commit charges against total physical RAM under the Performance tab of the task manager. You don't want to be working in a regime where the commit charge routinely exceeds physical RAM.

    • Check that file system errors are not being logged to the Event Viewer (under Administrative Tools). If file system errors are apparent, locate the cause sooner rather than later. If it is a problem with file system consistency, recovery is very possible early on, but the chance diminishes over time. Run CHKDSK to verify integrity of the file system.

    • If system performance is lagging, the most obvious first step is to identify the processes that are pulling the greatest fraction of CPU cycles. The processes which draw the most significant fraction of CPU uptime may not be the critical issue, but they may be indirectly related. In any event, it will narrow the field. The Windows Task Manager is obviously a place to start examining system process performance issues, but applications such as Process Explorer can be used to glean greater insight into details surrounding the various processes running on your system.

    • Assess the set of applications that are always launched on system start. If there are infrequently used applications which are always running, this is generally an area in which to trim utilization of system resources.

    • Is your system littered with gigabytes of temporary files and so on? Take a moment to perform a small amount of system hygiene by running CCleaner. The more adventurous may entertain a bit of registry straightening and examination/pruning of startup entries as well using this rather handy utility. Note - for those situations in which you do wish to install yet another application (for example, suspected system instability), a portable version of CCleaner is available from the other builds section of the site.

    • Do you want to "refresh" your OS installation but don't want to rebuild the system from scratch? Then How to Perform a Windows XP Repair Install may be just what is needed...

    • On the other hand, while it seems somewhat popular, the tendency to strip down Windows services to a bare minimum is not recommended. The problem lies less in issues which crop up immediately, but in unanticipated behavior down the road for applications which assume some of the disabled services are running. If you're not an advanced user, this is generally a tweak better left untouched.
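    As an aside on the temporary-file hygiene mentioned above, the core idea behind cleanup utilities can be sketched in Python. This is not how CCleaner itself works; it is a report-only illustration (the age threshold is an arbitrary assumption), so the list can be reviewed before anything is deleted:

```python
import time
from pathlib import Path

def stale_temp_files(temp_dir: Path, max_age_days: float = 7) -> list[Path]:
    """Return files under temp_dir not modified within max_age_days.

    Report-only by design: inspect the list before deleting anything.
    """
    cutoff = time.time() - max_age_days * 86400
    return [p for p in temp_dir.rglob("*")
            if p.is_file() and p.stat().st_mtime < cutoff]
```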
    The underlying message is that you shouldn't automatically presume that all unexpected or unanticipated issues experienced using a PC are due to malware. Consider, explore, and rule out the more mundane explanations before leaping to the conclusion that you're infected. You may very well be infected, but similar symptoms can arise from many sources.
     
  11. BlueZannetti

    Summary

    Let's try to pull much of what is presented above together....

    • Backup, backup, backup... Millions of electrons have been spilled discussing malware and how to deal with it. Data retention in the home environment has been almost an afterthought for most users. It's time to push that afterthought to the front. Malware will certainly be an ongoing battle, but that's only half of the equation towards guaranteeing your digital assets.

    • Don't react to noise... While it's prudent to adjust any strategy you've implemented to a changing situation, make sure that the situation is really changing before making adjustments.

    • Understand how your system behaves...., and as a corollary, if it is changing all the time, you'll have a very hard time meeting this goal. This is to try to assist you in developing an appreciation for when something has really changed in the background.

    • If you do make changes...., make the changes sequentially, one at a time, and observe behavior for at least a short period of time between changes. Multiple changes made in parallel create a situation that is much more difficult to debug.

    • Your system is unique.... It may be in the software you generally use, the selection of hardware, or simply the mode in which you use it. If you wish to really understand how an application or an approach will behave in your hands, you need to run the classical challenge-response experiment. You can develop a general appreciation via discussion with others but, ultimately, your computing ecosystem is unique.

    • Embrace simplicity...., if only to help you down the road if you face a debugging situation. While the exotic multidimensional security configurations have the appeal of apparent comprehensiveness (and often they are...), one really needs to ask at what cost? This comment shouldn't be interpreted as a dismissal of any specific approach (say AV vs. HIPS, for example). Each has a role. I simply believe that this role does not need to be one in which every permutation is layered and guarding the potential weaknesses of all other components. Every hypothetical eventuality does not need to be addressed a priori by everyone. Take sensible precautions, pay attention, be aware, and you should be fine. Implement a few measures well rather than many measures in haphazard fashion.

    • Always have Plan B at the ready.... Whenever you make changes, install software, try beta level software, and so on..., have a path to recovery available (yes, we've returned to backup, backup, backup...)...

    • As with virtually anything you read on the Internet - perform your own due diligence and vetting of the information presented above. Pressure test the material against your own understanding of the situation. Some items may be badly presented, others may implicitly presume factors not applicable in your own situation, while others may genuinely benefit from further technical refinement by the user community.

    • Finally, if members would like to further discuss some of the key areas mentioned above, by all means open a new thread in the appropriate forum for discussion.
    Lists and Resources

     