Basic Windows PC Maintenance Tips


Most small business owners don’t know much about basic computer maintenance and, as a result, their PCs slow down or crash. The real issue is neglect: failing to install security patches, failing to update antivirus software, and overloading the system with trial software.
Of course, many small business owners don’t know much about cars either, but they know to give the car gas, change the oil every so often, and keep an eye out for flat tires. It’s the same with PCs. You don’t need to be an expert to keep your PC in relatively good condition. You just need to perform a little basic maintenance and, more importantly, be observant.
Here are seven simple steps you can take to keep your PC running quickly and reliably.

Keep Windows Updated with the Latest Patches
Since Windows 98, Microsoft has provided access to Windows Update. Windows Update scans your system and updates it with the latest security patches and service packs. These are broken down into Critical and Recommended updates.
A new version of Windows Update, Microsoft Update, is also available. In addition to Windows, Microsoft Update will also patch a wide variety of Microsoft applications, such as Office and Windows Defender. Best of all, you can schedule these updates to run automatically, so there is really no excuse for not having a patched system. To access Windows Update, click the Start button, select All Programs, and scroll through the list to find it.

Keep Your Spyware and AntiVirus Programs Updated
No matter how good your spyware and antivirus software is, if it’s not updated or, worse, not running at all, then it won’t do you any good. Most antivirus applications load an icon in the Windows tray, which lets you verify its status at a glance. Always verify that the application is running after starting Windows.
In addition, these applications should be configured to download definition updates every day, and complete system scans should take place at least once a week. Should you need a new antivirus scanner, consider Avira AntiVir. Not only is it free, but it consistently performs near, if not at, the top of most comparison tests.
To combat malware, A-Squared and Malwarebytes' Anti-Malware are effective. Both are widely praised for their ease of use and effectiveness, and they’re free.

Keep Your Applications and Utilities Patched
Believe it or not, all of the applications and utilities on your system are prone to security risks and need to be updated regularly. Programs that you use every day, like Adobe Acrobat Reader, QuickTime, RealPlayer, Skype, and WinZip, require both maintenance and security updates from time to time.
Even applications that run in the background, like Flash and Java, are at risk. Trying to keep track of each of these individually can be a chore, but a nifty utility called Secunia PSI makes the job much easier. This free utility tracks a massive number of security exploits in applications and automatically monitors your PC for susceptible apps. When it finds one, it directs you to a site where you can download and install the needed patches. This program is an invaluable resource for keeping your PC secure.

Remove Unused Applications and Other Junk
Your PC has a lot of non-essential data (in other words, junk) stored on it, much of which you might not even be aware of. For instance, Internet Explorer stores copies of the web pages, images, and media you visit for faster viewing later. Plus there are temporary files, your Internet history, cookies, and more scattered throughout your system.
In addition, when your machine was brand new, it came pre-loaded with numerous pieces of trial software. These could be games, security suites, or even full-blown applications like QuickBooks or Microsoft Office. Many people never use them; others try them but decide not to purchase them at the end of the trial. Yet they remain on the system, wasting space and bloating the Windows Registry. Over time, this can lead to performance problems, causing Windows to become sluggish and unreliable.
One of the easiest ways to combat this is to use CCleaner, a freeware utility for system optimization, privacy, and cleaning. This tool removes unused files from the hard drive and cleans up your online history. More important, it includes an outstanding registry cleaner. It even has an uninstaller to assist you in removing applications from your system.

Pay Attention to the Software You Install
Many applications, especially freeware, attempt to install additional software on your system. For example, RealPlayer gives you the option to install Google Chrome.
Google Chrome is a good web browser, so that’s not necessarily a bad thing. However, some applications also try to install things you don’t need, like an extra toolbar in Internet Explorer. In almost all cases you’ll be asked whether or not you want this extra software installed.
The trick is that you must pay attention during the installation and actually read the screens that pop up, rather than mindlessly clicking the “Next” button until the process finishes. If you follow this tip, the amount of junk installed on your system will decrease.
And should you find something installed without your authorization, uninstall it immediately. If it won’t uninstall, use Windows’ System Restore feature to revert to an earlier configuration. This brings us to our next tip…

Create a System Restore Point
Before you install any new software on your system, always create a System Restore point. Some software can play havoc with your system, causing all sorts of strange problems. System Restore helps you restore your computer's system files to an earlier point in time when your system was working well.
It's a safe way to undo system changes to your computer without affecting your personal files, such as e‑mail, documents or photos. Having a restore point can significantly reduce your downtime. Plus this functionality is built right into Windows so there is really no reason not to do it.
To create a system restore point, go to Control Panel and select Backup and Restore. Windows 7 users should click “Recover system settings or your computer”; Vista users should select “Create a restore point or change settings.” After you have created a restore point, you can access and use it easily through CCleaner.

Defragment and Check Your Hard Drive for Errors Regularly
In order to help maintain the integrity of your data, there are two hard drive tasks that you should run at least once a month. The first is to defragment your hard drive. Over the course of regular use, your files become fragmented, or spread out all over your hard drive. So while an MP3 or WMV file appears as a single file in Windows Explorer, small pieces of the file could literally be spread across the entire drive.
Gathering all of these distant pieces back together into a single contiguous file makes file access faster. Depending on how fragmented the data on your drive is, defragmenting it could make your system noticeably faster.
The other test is Check Disk. This tool checks hard disk volumes for problems and attempts to repair any that it finds. For example, it can repair problems related to bad sectors, lost clusters, cross-linked files and directory errors. Disk errors are a common source of difficult-to-track problems, and running this test regularly can significantly reduce your risk of problems.
Windows has a built-in defragmenter and check-disk utility. To access either of them, open Windows Explorer and right-click on the drive you want to examine. Select Properties and then click the Tools tab. To defragment your hard drive, go to the Defragmentation section and press the “Defragment now” button. To perform a check disk, go to the Error-checking section and press the “Check now” button.
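
If you prefer to run these checks from a script instead of clicking through Explorer, the built-in command-line tools can do the same job. The following is only a minimal sketch, assuming Python is installed and the script is run from an elevated (administrator) prompt; the drive letter is just an example.

    import subprocess

    DRIVE = "C:"  # drive to maintain; adjust as needed

    # Defragment the drive with the built-in Windows defrag utility.
    subprocess.run(["defrag", DRIVE], check=True)

    # Read-only file system check with the built-in chkdsk utility.
    # Add the /f switch to repair errors; on the system drive the repair
    # is usually scheduled for the next reboot.
    result = subprocess.run(["chkdsk", DRIVE])
    if result.returncode != 0:
        print("chkdsk reported problems; consider running: chkdsk " + DRIVE + " /f")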
Certain free third-party defragmentation utilities have significant advantages over the one built into Windows. For instance, both UltraDefrag and Smart Defrag perform the job much more quickly than the built-in version. You can schedule them to run automatically and transparently in the background while you work. Try them both for yourself.
You don’t need to be a computer expert to keep your computer running well. Resolving these issues doesn’t require a deep understanding of computers; it requires paying attention to what you’re doing and actually reading the messages that pop up on screen during an installation. Just follow these basic steps, and I guarantee your computer will be safer and far more reliable.

Daily Computer Maintenance Tips:
Update your antivirus and anti-spyware definitions, if this isn’t being done automatically.
Back up any critical files you have changed today to portable removable media; a minimal automation sketch follows below.
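
One way to automate that daily backup is a small copy script. The sketch below is only an illustration and makes assumptions: the source folder is your Documents folder, the removable drive is mounted as E:, and “changed today” means modified in the last 24 hours.

    import shutil
    import time
    from pathlib import Path

    SOURCE = Path.home() / "Documents"     # folder with your working files
    DEST = Path("E:/DailyBackup")          # removable media (assumed drive letter)
    cutoff = time.time() - 24 * 60 * 60    # files modified in the last 24 hours

    for path in SOURCE.rglob("*"):
        if path.is_file() and path.stat().st_mtime >= cutoff:
            target = DEST / path.relative_to(SOURCE)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(path, target)      # copy file along with its timestamps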

Monthly Computer Maintenance Tips:
Clean up your temp files, your temporary Internet files, and other junk files about once a month. To do this easily, you can either download my favorite cleaning program, CCleaner, or run the built-in Disk Cleanup tool in Windows XP or Windows 7.
Ensure you have the latest Windows updates installed. Go to Internet Explorer, Tools, Windows Update, and click the Custom button. Always use the Custom button so you can check what’s going to be installed before it gets installed. Windows Update may ask you to download and install the latest version of itself; go ahead and do that, then click Close when it’s finished, and then Continue. It will then check again for real updates and offer those. Choose which updates you want to install and uncheck the ones you don’t.
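
For a rough idea of what these cleanup tools do under the hood, here is a minimal sketch that deletes old files from the user's temp folder. It assumes a Windows %TEMP% environment variable, only touches files older than a week, and silently skips anything that is locked or in use; CCleaner and Disk Cleanup cover far more locations than this.

    import os
    import time

    temp_dir = os.environ.get("TEMP", "")
    cutoff = time.time() - 7 * 24 * 60 * 60   # only files older than 7 days
    freed = 0

    for root, _dirs, files in os.walk(temp_dir):
        for name in files:
            path = os.path.join(root, name)
            try:
                if os.path.getmtime(path) < cutoff:
                    size = os.path.getsize(path)
                    os.remove(path)
                    freed += size
            except OSError:
                pass                           # file in use or permission denied

    print("Freed about %.1f MB" % (freed / (1024 * 1024)))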

Clean out your email, paying special attention to your Inbox and Sent folder. The easiest way I’ve found is to sort your mailbox by message size and delete the largest unneeded emails first.
Spyware: A type of malware (malicious software) installed on computers that collects information about users without their knowledge. The presence of spyware is typically hidden from the user and can be difficult to detect. Some spyware, such as keyloggers, may be installed intentionally by the owner of a shared, corporate, or public computer in order to monitor users.

Malware: Malicious computer software that interferes with normal computer functions or sends personal data about the user to unauthorized parties over the Internet.

Computer worm: A standalone malware program that replicates itself in order to spread to other computers.

Service Delivery Processes


-Service Level Management

Service level management provides a mechanism to align the IT services with the business requirements. Service level management provides a structured way for customers and providers of IT services to meaningfully discuss and assess how well a service is being delivered. The primary objective of service level management is to provide a mechanism for setting clear expectations with the customer and user groups with respect to the service being delivered. Activities included in the process are creating a catalog of services, identifying requirements, negotiating SLAs, and managing service continuity, availability, capacity, and workforce.

-Financial Management

The Financial Management process introduces the concepts of budgeting, accounting, and charging for IT services delivered to customers. Budgeting and accounting involve understanding the cost of providing various services. Financial management ensures that any IT service proposed is justified from a cost and budget standpoint; this is often referred to as a cost-benefit analysis. Activities performed within the function include such standard accounting practices as budgeting, cost allocations, and others. The concept of chargeback allows internal IT departments to function as a business unit, controls non-essential demands from customers, and allows customers to demand value for money.

-Availability Management

The goal of availability management is to ensure that the IT services are available to users when they need them. Availability is calculated and reported as percentage of the agreed service hours for which the service was available.
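
As a quick worked example of that calculation (the figures below are invented for illustration):

    # Availability as a percentage of agreed service hours (illustrative numbers).
    agreed_hours = 24 * 30           # service agreed for 24x7 over a 30-day month
    downtime_hours = 3.5             # total unplanned outage in the month

    availability = (agreed_hours - downtime_hours) / agreed_hours * 100
    print("Availability: %.2f%%" % availability)   # Availability: 99.51%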

-Capacity Management

Capacity management activities include planning, sizing, and controlling service solution capacity to satisfy user demand within the performance levels set forth in the SLA. This requires the collection of information about usage scenarios, patterns, and peak load characteristics of the service solution, as well as stated performance requirements. These data collection activities are included in the Capacity Management process.

-IT Service Continuity Management

IT service continuity management, also known as contingency management, focuses on minimizing the disruptions to business caused by the failure of mission-critical systems. This process deals with planning to cope with and recover from an IT disaster. It also provides guidance on safeguarding the existing systems by the development and introduction of proactive and reactive countermeasures. IT service continuity management also considers what activities need to be performed in the event of a service outage not attributed to a full-blown disaster.

Mobile Technology

Mobile technology is a collective term used to describe the various types of cellular communication technology. Mobile CDMA technology has evolved rapidly over the past few years. Since the beginning of this millennium, a standard mobile device has gone from being no more than a simple two-way pager to being a cellular phone, GPS navigation system, embedded web browser, instant messaging client, and hand-held video gaming system. Many experts argue that the future of computer technology rests in mobile/wireless computing.

The United States military is now using mobile technology as a tool for information dissemination and collection in the battlefield arena. "Numerous agencies including the Department of Defense (DoD), Department of Homeland Security (DHS), Intelligence community, and law enforcement are utilizing mobile technology for information management."


4G networking

One of the most important features of 4G mobile networks is the domination of high-speed packet transmissions, or burst traffic, in the channels. If the same codes used in 2G-3G networks are applied to future 4G mobile or wireless networks, the detection of very short bursts will be a serious problem due to their very poor partial correlation properties. Recent studies have indicated that the traditional multi-layer network architecture based on the OSI model may not be well suited for 4G mobile networks, where transactions of short packets will make up the major part of the traffic in the channels. As the packets from different mobiles carry completely different channel characteristics, the receiver must execute all necessary algorithms, such as channel estimation and interactions with all upper layers, within a very short time to make the detection of each packet flawless and to reduce the clutter of traffic.

Operating systems

There are many types of smartphone operating systems available, including Symbian, Android, BlackBerry OS, webOS, Apple iOS, Windows Mobile Professional (touch screen), Windows Mobile Standard (non-touch screen), and Bada OS. Among the most popular are Apple's iOS and the newest entrant, Android. Android is a mobile operating system (OS) developed by Google. Android is the first completely open-source mobile OS, meaning that it is free to any cell phone carrier. The Apple iPhone, which comes in several models such as the 3G and 3GS, is the most popular smartphone at this time because of its customizable OS, which you can use to download applications ("apps") made by Apple, such as games, GPS tools, utilities, and more. Any user can also create their own apps and publish them to Apple's App Store. The Palm Pre, which uses webOS, has Internet functionality and is able to support Internet-based programming languages such as CSS, HTML, and JavaScript. The BlackBerry, from RIM, is a smartphone with a multimedia player and support for third-party software installation. Windows Mobile Professional smartphones (Pocket PC or Windows Mobile PDA) are like a PDA and have touchscreen capabilities. Windows Mobile Standard devices do not have a touch screen but use a trackball, touchpad, rockers, etc.

Symbian is the original smartphone OS, with the richest history and the largest marketshare. Although no single Symbian device has sold as many units as the iPhone, Nokia and other manufacturers (currently including Sony Ericsson and Samsung, and previously Motorola) release a wide variety of Symbian models every year which gives Symbian the greatest marketshare.

Channel hogging & file sharing

File sharing will take a hit. The normal web surfer might want to look at a new web page every minute or so, and at 100 kbps you could get your page fairly quickly. Because of changes to the security of wireless networks, however, you will not be able to do huge file transfers, because the service providers want to cut down on channel hogging. AT&T claimed that it would ban any of its users caught using "peer-to-peer" (P2P) file sharing applications on its 3G network. It then became apparent that this would keep users from using their iTunes programs; those users would be forced to find a Wi-Fi hotspot in order to download their music. The limitations of wireless networking will not be cured by 4G, as there are simply too many fundamental differences between wireless networking and other means of Internet access. If wireless vendors do not recognize these differences and bandwidth limitations, future wireless customers will find themselves quite disappointed and the market will suffer quite a setback.

Cloud Computing


Cloud computing is the delivery of computing as a service rather than a product, whereby shared resources, software, and information are provided to computers and other devices as a metered service over a network (typically the Internet).


Cloud computing logical diagram


Cloud computing is a marketing term for technologies that provide computation, software, data access, and storage services that do not require end-user knowledge of the physical location and configuration of the system that delivers the services.
It is also a delivery model for IT services based on Internet protocols, and it typically involves provisioning of dynamically scalable and often virtualized resources. Cloud computing is a byproduct of the ease of access to remote computing sites provided by the Internet. This may take the form of web-based tools or applications that users can access and use through a web browser as if the programs were installed locally on their own computers.
Cloud computing providers deliver applications via the Internet, which are accessed from web browsers and desktop and mobile apps, while the business software and data are stored on servers at a remote location. In some cases, legacy applications (line-of-business applications that until now have been prevalent in thin-client Windows computing) are delivered via a screen-sharing technology, while the computing resources are consolidated at a remote data centre location; in other cases, entire business applications have been coded using web-based technologies such as AJAX.
At the foundation of cloud computing is the broader concept of infrastructure convergence (or converged infrastructure) and shared services. This type of data centre environment allows enterprises to get their applications up and running faster, with easier manageability and less maintenance, and enables IT to more rapidly adjust IT resources (such as servers, storage, and networking) to meet fluctuating and unpredictable business demand.
Most cloud computing infrastructures consist of services delivered through shared data centres, which appear to consumers as a single point of access for their computing needs. Commercial offerings may be required to meet service-level agreements (SLAs), but specific terms are less often negotiated by smaller companies.
The tremendous impact of cloud computing on business has prompted the United States federal government to look to the cloud as a means to reorganize its IT infrastructure and decrease IT budgets. With top government officials mandating cloud adoption, the effect is expected to trickle down, and many government agencies already have at least one or more cloud systems online.



Characteristics

Cloud computing exhibits the following key characteristics:
  • Empowerment of end-users of computing resources by putting the provisioning of those resources in their own control, as opposed to the control of a centralized IT service
  • Agility improves with users' ability to re-provision technological infrastructure resources.
  • Application programming interface (API) accessibility to software that enables machines to interact with cloud software in the same way the user interface facilitates interaction between humans and computers. Cloud computing systems typically use REST-based APIs (a small illustrative sketch follows this list).
  • Cost is claimed to be reduced and in a public cloud delivery model capital expenditure is converted to operational expenditure. This is purported to lower barriers to entry, as infrastructure is typically provided by a third-party and does not need to be purchased for one-time or infrequent intensive computing tasks. Pricing on a utility computing basis is fine-grained with usage-based options and fewer IT skills are required for implementation (in-house).
  • Device and location independence enable users to access systems using a web browser regardless of their location or what device they are using (e.g., PC, mobile phone). As infrastructure is off-site (typically provided by a third-party) and accessed via the Internet, users can connect from anywhere.
  • Multi-tenancy enables sharing of resources and costs across a large pool of users thus allowing for:
    • Centralization of infrastructure in locations with lower costs (such as real estate, electricity, etc.)
    • Peak-load capacity increases (users need not engineer for highest possible load-levels)
    • Utilisation and efficiency improvements for systems that are often only 10–20% utilised.
  • Reliability is improved if multiple redundant sites are used, which makes well-designed cloud computing suitable for business continuity and disaster recovery.
  • Scalability and Elasticity via dynamic ("on-demand") provisioning of resources on a fine-grained, self-service basis near real-time, without users having to engineer for peak loads.
  • Performance is monitored, and consistent and loosely coupled architectures are constructed using web services as the system interface.
  • Security could improve due to centralization of data, increased security-focused resources, etc., but concerns can persist about loss of control over certain sensitive data, and the lack of security for stored kernels. Security is often as good as or better than other traditional systems, in part because providers are able to devote resources to solving security issues that many customers cannot afford. However, the complexity of security is greatly increased when data is distributed over a wider area or greater number of devices and in multi-tenant systems that are being shared by unrelated users. In addition, user access to security audit logs may be difficult or impossible. Private cloud installations are in part motivated by users' desire to retain control over the infrastructure and avoid losing control of information security.
  • Maintenance of cloud computing applications is easier, because they do not need to be installed on each user's computer.
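
To make the REST-based API point in the list above concrete, the sketch below shows what a typical call to a cloud provider's API looks like. The endpoint, token, and response fields are hypothetical; real providers publish their own URLs, authentication schemes, and payload formats.

    import requests   # third-party HTTP library (pip install requests)

    API_BASE = "https://api.example-cloud.com/v1"   # hypothetical provider
    TOKEN = "REPLACE_WITH_YOUR_API_TOKEN"

    # List virtual machine instances with a RESTful GET request.
    resp = requests.get(
        API_BASE + "/instances",
        headers={"Authorization": "Bearer " + TOKEN},
        timeout=10,
    )
    resp.raise_for_status()

    for inst in resp.json().get("instances", []):
        print(inst.get("id"), inst.get("status"))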

History

The term "cloud" is used as a metaphor for the Internet, based on the cloud drawing used in the past to represent the telephone network, and later to depict the Internet in computer network diagrams as an abstraction of the underlying infrastructure it represents.
Cloud computing is a natural evolution of the widespread adoption of virtualisation, service-oriented architecture, autonomic, and utility computing. Details are abstracted from end-users, who no longer have need for expertise in, or control over, the technology infrastructure "in the cloud" that supports them.
The underlying concept of cloud computing dates back to the 1960s, when John McCarthy opined that "computation may someday be organised as a public utility." Almost all the modern-day characteristics of cloud computing (elastic provision, provided as a utility, online, illusion of infinite supply), the comparison to the electricity industry and the use of public, private, government, and community forms, were thoroughly explored in Douglas Parkhill's 1966 book, The Challenge of the Computer Utility. Other scholars have shown that cloud computing's roots go all the way back to the 1950s when scientist Herb Grosch (the author of Grosch's law) postulated that the entire world would operate on dumb terminals powered by about 15 large data centers.
The actual term "cloud" borrows from telephony in that telecommunications companies, who until the 1990s offered primarily dedicated point-to-point data circuits, began offering Virtual Private Network (VPN) services with comparable quality of service but at a much lower cost. By switching traffic to balance utilisation as they saw fit, they were able to utilise their overall network bandwidth more effectively. The cloud symbol was used to denote the demarcation point between that which was the responsibility of the provider and that which was the responsibility of the user. Cloud computing extends this boundary to cover servers as well as the network infrastructure.
After the dot-com bubble, Amazon played a key role in the development of cloud computing by modernising their data centers, which, like most computer networks, were using as little as 10% of their capacity at any one time, just to leave room for occasional spikes. Having found that the new cloud architecture resulted in significant internal efficiency improvements whereby small, fast-moving "two-pizza teams" could add new features faster and more easily, Amazon initiated a new product development effort to provide cloud computing to external customers, and launched Amazon Web Service (AWS) on a utility computing basis in 2006.
In early 2008, Eucalyptus became the first open-source, AWS API-compatible platform for deploying private clouds. In early 2008, OpenNebula, enhanced in the RESERVOIR European Commission-funded project, became the first open-source software for deploying private and hybrid clouds, and for the federation of clouds. In the same year, efforts were focused on providing QoS guarantees (as required by real-time interactive applications) to cloud-based infrastructures, in the framework of the IRMOS European Commission-funded project, resulting in a real-time cloud environment. By mid-2008, Gartner saw an opportunity for cloud computing "to shape the relationship among consumers of IT services, those who use IT services and those who sell them" and observed that "[o]rganisations are switching from company-owned hardware and software assets to per-use service-based models" so that the "projected shift to cloud computing ... will result in dramatic growth in IT products in some areas and significant reductions in other areas."

Layers

Once an internet protocol connection is established among several computers, it is possible to share services within any one of the following layers.


Cloud computing stack

Client

A cloud client consists of computer hardware and/or computer software that relies on cloud computing for application delivery and that is in essence useless without it. Examples include some computers (example: Chromebooks), phones (example: Google Nexus series) and other devices, operating systems (example: Google Chrome OS), and browsers.

Application

Cloud application services or "Software as a Service (SaaS)" deliver software as a service over the Internet, eliminating the need to install and run the application on the customer's own computers and simplifying maintenance and support.
A cloud application is software provided as a service. It consists of the following: a package of interrelated tasks, the definition of these tasks, and the configuration files, which contain dynamic information about tasks at run-time. Cloud tasks provide compute, storage, communication and management capabilities. Tasks can be cloned into multiple virtual machines, and are accessible through application programmable interfaces (API). Cloud applications are a kind of utility computing that can scale out and in to match the workload demand. Cloud applications have a pricing model that is based on different compute and storage usage, and tenancy metrics.
What makes a cloud application different from other applications is its elasticity. Cloud applications have the ability to scale out and in. This can be achieved by cloning tasks into multiple virtual machines at run-time to meet the changing workload demand. Configuration data is where the dynamic aspects of a cloud application are determined at run-time; there is no need to stop the running application or redeploy it in order to modify or change the information in this file.
SOA is an umbrella term that describes any kind of service. A cloud application is a service. A cloud application meta-model is a SOA model that conforms to the SOA meta-model. This makes cloud applications SOA applications. However, SOA applications are not necessarily cloud applications. A cloud application is a SOA application that runs under a specific environment, which is the cloud computing environment (platform). This environment is characterized by horizontal scalability, rapid provisioning, ease of access, and flexible prices. While SOA is a business model that addresses business process management, cloud architecture addresses many technical details that are environment specific, which makes it more of a technical model.
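
A toy sketch of the scale-out/scale-in decision described above is shown here. The capacity figure and the clone_task/retire_task helpers are hypothetical stand-ins for whatever provisioning API a real cloud platform exposes.

    def clone_task():
        print("scale out: starting another task instance")

    def retire_task():
        print("scale in: retiring an idle task instance")

    def desired_instances(current_load, capacity_per_instance=100):
        """How many instances the current workload needs (ceiling division)."""
        return max(1, -(-current_load // capacity_per_instance))

    def rebalance(running_instances, current_load):
        target = desired_instances(current_load)
        for _ in range(target - running_instances):
            clone_task()       # scale out: clone the task into another VM
        for _ in range(running_instances - target):
            retire_task()      # scale in: retire an idle copy
        return target

    # Example: 2 instances running, load of 450 requests/sec -> scale out to 5.
    rebalance(2, 450)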

Platform

Cloud platform services, also known as platform as a service (PaaS), deliver a computing platform and/or solution stack as a service, often consuming cloud infrastructure and sustaining cloud applications. They facilitate deployment of applications without the cost and complexity of buying and managing the underlying hardware and software layers. Cloud computing is becoming a major change in our industry, and one of the most important parts of this change is the shift to cloud platforms. Platforms let developers write applications that run in the cloud, or even use services provided by the cloud. Different names are used for such platforms, including the on-demand platform, or Cloud 9. Regardless of the nomenclature, they all have great potential for development, and when development teams create applications for the cloud, each must build on its own cloud platform.

Infrastructure

Cloud infrastructure services, also known as "infrastructure as a service" (IaaS), deliver computer infrastructure – typically a platform virtualization environment – as a service, along with raw (block) storage and networking. Rather than purchasing servers, software, data-center space or network equipment, clients instead buy those resources as a fully outsourced service. Suppliers typically bill such services on a utility computing basis; the amount of resources consumed (and therefore the cost) will typically reflect the level of activity.
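
As a simple illustration of utility-style billing for infrastructure, here is a sketch with made-up rates; actual providers publish their own pricing and metering rules.

    # Pay-per-use bill for IaaS resources (illustrative rates only).
    RATES = {
        "vm_hours": 0.10,          # dollars per VM-hour
        "storage_gb_month": 0.05,  # dollars per GB stored per month
        "egress_gb": 0.09,         # dollars per GB of outbound traffic
    }

    usage = {"vm_hours": 720, "storage_gb_month": 200, "egress_gb": 50}

    bill = sum(usage[item] * RATES[item] for item in usage)
    print("Monthly charge: $%.2f" % bill)   # Monthly charge: $86.50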

Server

The servers layer consists of computer hardware and/or computer software products that are specifically designed for the delivery of cloud services, including multi-core processors, cloud-specific operating systems and combined offerings.

Deployment models 



Cloud computing types

Public cloud

A public cloud is one based on the standard cloud computing model, in which a service provider makes resources, such as applications and storage, available to the general public over the Internet. Public cloud services may be free or offered on a pay-per-usage model.

Community cloud

Community cloud shares infrastructure between several organizations from a specific community with common concerns (security, compliance, jurisdiction, etc.), whether managed internally or by a third-party and hosted internally or externally. The costs are spread over fewer users than a public cloud (but more than a private cloud), so only some of the cost savings potential of cloud computing are realized.

Hybrid cloud

Hybrid cloud is a composition of two or more clouds (private, community, or public) that remain unique entities but are bound together, offering the benefits of multiple deployment models. It can also be defined as multiple cloud systems that are connected in a way that allows programs and data to be moved easily from one deployment system to another.

Private cloud

Private cloud is infrastructure operated solely for a single organization, whether managed internally or by a third-party and hosted internally or externally.
Private clouds have attracted criticism because users "still have to buy, build, and manage them" and thus do not benefit from less hands-on management, essentially "[lacking] the economic model that makes cloud computing such an intriguing concept".

Architecture


Cloud computing sample architecture 
Cloud architecture, the systems architecture of the software systems involved in the delivery of cloud computing, typically involves multiple cloud components communicating with each other over a loose coupling mechanism such as a messaging queue.
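
The loose coupling mentioned above can be illustrated with an in-process message queue: the component that produces work never calls the component that consumes it directly. This is only a toy, single-machine sketch of the pattern; real cloud architectures use dedicated messaging services.

    import queue
    import threading

    work_queue = queue.Queue()

    def front_end():
        # Produce a few requests, then signal that no more work is coming.
        for request_id in range(3):
            work_queue.put({"request": request_id})
        work_queue.put(None)                  # sentinel value

    def back_end():
        # Consume requests whenever they arrive; never calls front_end directly.
        while True:
            message = work_queue.get()
            if message is None:
                break
            print("processed", message)

    threading.Thread(target=front_end).start()
    back_end()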

The Intercloud

The Intercloud is an interconnected global "cloud of clouds" and an extension of the Internet "network of networks" on which it is based.

Cloud engineering

Cloud engineering is the application of engineering disciplines to cloud computing. It brings a systematic approach to the high level concerns of commercialization, standardization, and governance in conceiving, developing, operating and maintaining cloud computing systems. It is a multidisciplinary method encompassing contributions from diverse areas such as systems, software, web, performance, information, security, platform, risk, and quality engineering.

Issues

Privacy

The cloud model has been criticized by privacy advocates for the greater ease with which the companies hosting the cloud services can control, and thus monitor at will, lawfully or unlawfully, the communication and data stored between the user and the host company. Instances such as the secret NSA program working with AT&T and Verizon, which recorded over 10 million phone calls between American citizens, cause uncertainty among privacy advocates about the greater powers such arrangements give to telecommunication companies to monitor user activity. While there have been efforts (such as US-EU Safe Harbor) to "harmonize" the legal environment, providers such as Amazon still cater to major markets (typically the United States and the European Union) by deploying local infrastructure and allowing customers to select "availability zones." Cloud computing poses privacy concerns because the service provider may access the data that is on the cloud at any point in time; it could accidentally or deliberately alter or even delete some information.

Compliance

In order to obtain compliance with regulations including FISMA, HIPAA, and SOX in the United States, the Data Protection Directive in the EU and the credit card industry's PCI DSS, users may have to adopt community or hybrid deployment modes that are typically more expensive and may offer restricted benefits. This is how Google is able to "manage and meet additional government policy requirements beyond FISMA" and Rackspace Cloud or QubeSpace are able to claim PCI compliance.[63]
Many providers also obtain SAS 70 Type II certification, but this has been criticized on the grounds that the hand-picked set of goals and standards determined by the auditor and the auditee are often not disclosed and can vary widely. Providers typically make this information available on request, under non-disclosure agreement.
Customers in the EU contracting with cloud providers established outside the EU/EEA have to adhere to the EU regulations on export of personal data.

Legal

As can be expected with any revolutionary change in the landscape of global computing, certain legal issues arise, ranging from trademark infringement and security concerns to the sharing of proprietary data resources.

Open source

Open-source software has provided the foundation for many cloud computing implementations, one prominent example being the Hadoop framework.[68] In November 2007, the Free Software Foundation released the Affero General Public License, a version of GPLv3 intended to close a perceived legal loophole associated with free software designed to be run over a network.

Open standards

Most cloud providers expose APIs that are typically well-documented (often under a Creative Commons license) but also unique to their implementation and thus not interoperable. Some vendors have adopted others' APIs and there are a number of open standards under development, with a view to delivering interoperability and portability.

Security

As cloud computing is achieving increased popularity, concerns are being voiced about the security issues introduced through adoption of this new model. The effectiveness and efficiency of traditional protection mechanisms are being reconsidered as the characteristics of this innovative deployment model differ widely from those of traditional architectures.
The relative security of cloud computing services is a contentious issue that may be delaying its adoption. Issues barring the adoption of cloud computing are due in large part to the private and public sectors' unease surrounding the external management of security-based services. It is the very nature of cloud computing-based services, private or public, that promotes external management of provided services. This delivers great incentive to cloud computing service providers to prioritize building and maintaining strong management of secure services. Security issues have been categorized into sensitive data access, data segregation, privacy, bug exploitation, recovery, accountability, malicious insiders, management console security, account control, and multi-tenancy issues. Solutions to various cloud security issues vary, from cryptography, particularly public key infrastructure (PKI), to use of multiple cloud providers, standardization of APIs, and improving virtual machine support and legal support.

Sustainability

Although cloud computing is often assumed to be a form of "green computing", there is as yet no published study to substantiate this assumption. Siting the servers affects the environmental effects of cloud computing: in areas where the climate favors natural cooling and renewable electricity is readily available, the environmental effects will be more moderate. (The same holds true for "traditional" data centers.) Thus countries with favorable conditions, such as Finland, Sweden, and Switzerland, are trying to attract cloud computing data centers. Energy efficiency in cloud computing can result from energy-aware scheduling and server consolidation. However, in the case of distributed clouds spanning data centers with different sources of energy, including renewable sources, a small compromise on energy consumption reduction could yield a large carbon footprint reduction.

Abuse

As with privately purchased hardware, crackers posing as legitimate customers can purchase the services of cloud computing for nefarious purposes. This includes password cracking and launching attacks using the purchased services. In 2009, a banking trojan illegally used the popular Amazon service as a command and control channel that issued software updates and malicious instructions to PCs that were infected by the trojan.