Small Businesses Going to the Cloud: Three Top Considerations


A look at the issues small companies need to make sure they address before jumping into cloud computing.

Small businesses, of course, can save a ton of money and gain a lot of efficiencies by going to the cloud. But getting there isn’t necessarily that simple. Fact is, one size does not fit all. “A startup marketing company, for example, may take a very different path from an established medical practice,” says Igal Rabinovich, CEO of IT Help Central, a White Plains, NY consulting firm. Here are some key considerations to take into account before making the move.

Create a migration plan.

It’s best not to make the change willy-nilly, particularly if you think you’ll be moving many applications to the cloud. That means having a roadmap for how you’ll proceed: introducing applications one at a time, testing each one before deciding to go ahead with it, and then moving on to the next. You also need to include a training period for employees to learn how to use each application.

The length and complexity of your plan, of course, depends on the number of applications you have, the size of your business and how distributed your workforce is, according to Ron Braatz, president of LiftOff Learning, an IT consulting firm. Introducing, say, an e-mail system to a highly distributed workforce would take longer than it would for a company where everyone works in the same office.

A plan can do more than help your move to the cloud go smoothly, however. It can also provide a larger strategic boost. Jill Billhorn, vice president of small business at CDW, a Vernon Hills, Ill., IT consulting firm, recalls a fast-growing client, an exercise business that was opening locations at a rapid pace. At first, the approach was to launch new venues and bring IT staff in on the plan only shortly before opening. “It ended up that IT had to spend much of their time putting out fires as a result,” says Billhorn. Eventually, the IT group decided to start scrutinizing the expansion plan for the following year and form a blueprint for introducing appropriate applications. As a result, as the business grew, they were able to operate more judiciously and effectively, and that helped overall expansion, according to Billhorn.

Using a plan also puts you ahead of the pack. Only 35% of small businesses have developed a written strategic roadmap for the adoption of cloud computing, according to a survey recently conducted by CDW.

Think about reliability.

Whatever you’re using the cloud for, chances are it’s important to the functioning of your business. So you want to make sure you have access you can rely on. Take Roper DeGarmo, president of Signature Personal Insurance, an insurance brokerage in Mission, Mo., who started using cloud applications eight years ago and now uses everything from e-mail to client data storage systems. According to DeGarmo, who until recently ran his business from home, his cable connection worked well until later in the day, when more people started using the Internet after returning from work. He ended up adding a DSL connection for Internet access at those times. “Having a fast connection is obviously great, but if the connection has stability problems it can wreak havoc with file uploads and online services,” says DeGarmo.

You also need to make sure your service providers have adequate backup precautions. For example, if you’re using a phone system, make sure calls will automatically be rerouted to another telephone line if the servers go down. “Always ask the question: what happens if you go down, and how will it impact me?” says Rabinovich.

Rabinovich, in fact, suggests small businesses think twice before putting certain mission-critical functions in the cloud. “I always ask clients, if the capability is down for a couple of hours or a couple of days, what will that mean for your business,” he says. “If the answer is, you won’t be able to function, you might not move that application to the cloud.”

Look at the legal issues.

For starters, scrutinize the fine print. Example: A cloud provider may waive liability in case of lost data. Depending on your industry, you also may need to make sure you’re compliant with regulations governing data. If, say, you operate in Europe or have European customers, you’ll need to consider the EU’s Data Protection Directive, which regulates the processing of personal data, according to Keith Broyles, a partner and specialist in intellectual property at Alston & Bird, a law firm in Atlanta. You also need to be aware of where your data will be hosted. The reason: If it will be on a server outside of the U.S. and there’s a problem, depending on your contractual provisions, you could wind up “not getting the benefit of U.S. laws,” says Broyles.

Then there’s the matter of your exit strategy. “You want to be mindful that there’s going to become a point when the relationship between you and your cloud vendor ends,” says Todd McClelland, who also is a partner at Alston & Bird. For that reason, you should negotiate your exit strategy upfront, rather than dealing with it when you’re about to pull the plug.

The upshot: going to the cloud has many benefits. For best success, however, you need to arm yourself with as much information as possible before jumping in.


Get smart about security

used with permission from HP Technology at Work

Congratulations, you’ve taken every step to secure data on your networks and PCs against increasingly malicious worms, Trojans and viruses. But don’t rest easy. All infrastructure elements, including printers, servers, storage, Wi-Fi networks and cloud computing are just as susceptible to surprising security threats. Forget them and your sense of security is nothing but a dream.

Whether they’re criminals looking to blackmail your business, technically savvy vandals getting their kicks, revenge-minded former employees or even competitors, hackers all have one thing in common: they want to disrupt your business operations for money, other gain—or simply for fun.

So, what can you do? Read on for some valuable tips to bolster your overall IT defense. Combined with regular and diligent employee training and education, these pointers can help you better spot and prevent disruptive security attacks.

Mobile dos and don’ts

Small businesses, even more than large companies, are implementing bring-your-own-device (BYOD) policies for smartphones, tablets and other mobile devices. The ubiquity of such products can lead companies to assume that their business information safely resides on them. Wrong.

Your IT department is responsible for protecting company data, regardless of where it’s housed. What to do? For one thing, businesses must set firm policies about what data are allowed on employee-owned devices. It’s also wise to weigh the relative safety of available smartphone operating systems and perhaps require data to be stored on an approved server or in the cloud.

Safe and secure storage

Servers and storage devices also present a unique set of security challenges. Denial-of-service (DoS) attacks, for example, can overload those running web applications and compromise network bandwidth, memory, CPU use and hard-disk space. Solutions like the HP ProLiant G8 servers deliver comprehensive data and client protection and security.

Working without wires

Wi-Fi networks aren’t immune from sabotage-minded attackers, either. Consider these dangers:

  • Weak personal identification numbers (PINs) allow any user to access a wireless network at will. A laptop-equipped troublemaker sitting in your parking lot might be able to hack into your important data this way.
  • Security gaps allow wireless users to snoop on each other’s networks.
  • Operating system flaws provide easy backdoor access to a single computer, or even to an entire network.
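To see how weak a short PIN really is, it helps to do the math. The sketch below is illustrative, using an eight-digit setup PIN of the kind defined by Wi-Fi Protected Setup (WPS), where a well-known design flaw lets an attacker verify the PIN in two halves rather than all at once:

```python
# Brute-force effort against a hypothetical 8-digit Wi-Fi setup PIN.
# Naively, 8 digits means 10**8 guesses. But WPS validates the PIN in
# two halves (first 4 digits, then 3 digits plus a checksum digit),
# so a worst-case attack needs far fewer attempts.

naive_guesses = 10 ** 8                 # 100,000,000 combinations
first_half = 10 ** 4                    # 10,000 possibilities
second_half = 10 ** 3                   # 1,000 (last digit is a checksum)
split_attack_guesses = first_half + second_half

print(f"Naive brute force: {naive_guesses:,} guesses")
print(f"Split-validation attack: {split_attack_guesses:,} guesses")
```

Eleven thousand attempts is well within reach of that laptop in the parking lot, which is why disabling PIN-based setup and using strong passphrases matters.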

Easy first steps to securing your network include simplifying network management, implementing clearly defined BYOD security policies and making rogue Wi-Fi access more difficult with services like HP TippingPoint networking security solutions.

Consider the cloud

True, the cloud improves server, storage and network access and is less expensive than physical systems. But with easy data access come serious confidentiality concerns. Careful monitoring, strict access control and encrypted data are among the best security measures, along with the use of a private, rather than public, enterprise cloud.

IT infrastructure aside, simple password security is surprisingly often overlooked in developing an overall security plan. Increased password complexity, along with the use of single sign-on and other technologies, is essential.
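As a rough illustration (not any particular product’s policy engine), a minimal password-complexity check might look like this; the specific rules below are assumptions you would tailor to your own policy:

```python
import string

def is_complex_enough(password: str, min_length: int = 12) -> bool:
    """Check a password against a simple, illustrative complexity policy:
    minimum length plus at least one lowercase letter, uppercase letter,
    digit and symbol."""
    if len(password) < min_length:
        return False
    has_lower = any(c in string.ascii_lowercase for c in password)
    has_upper = any(c in string.ascii_uppercase for c in password)
    has_digit = any(c in string.digits for c in password)
    has_symbol = any(c in string.punctuation for c in password)
    return all([has_lower, has_upper, has_digit, has_symbol])

print(is_complex_enough("password123"))        # fails: too short, too simple
print(is_complex_enough("C0rrect-Horse-42!"))  # passes this policy
```

A check like this can sit in a signup form or an internal audit script; the real gains come from pairing it with single sign-on so employees have fewer passwords to manage in the first place.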

Staying one step ahead of cyber criminals demands detailed development of security policies and processes. Proactive businesses that develop comprehensive security plans better ensure their own safety, integrity, reputations and bottom-line profitability.


Data growing pains?

used with permission from HP Technology at Work

Virtualization. Like other technology buzzwords, some users work this term into business conversations without really understanding its meaning or how its strategic application can streamline operational efficiencies, improve resource allocation, enhance network security and reduce costs.

It’s worth learning. Careful evaluation of existing non-virtualized environments is the most vital first step toward choosing the best virtual server and storage solutions for any given environment. This evaluation should be done with an eye on present and anticipated computing and power requirements, as well as the number of existing and future users.

Growing data storage requirements are always a major concern of large corporations and institutions. But “big data” has become an issue for small businesses, too. Varying operating systems, a growing number of applications and the increased use of mobile, BYOD and other technologies threaten to overwhelm existing physical server and storage solution capacities.

Rather than allocating resources toward upgrading aging servers or buying new ones (the ol’ “throwing good money after bad”), more IT and other administrators see the benefits of “going virtual.” Indeed, Acronis’ Global Disaster Recovery Index found that 21 percent of surveyed small businesses planned to adopt virtualization last year, a number most likely to increase in 2013.

Additional virtualization benefits include enhanced network performance, lower maintenance costs, streamlined and centralized management capabilities, improved disaster recovery, and the flexibility to easily accommodate additional users and applications. The buzz surrounding virtualization is well deserved. But what does that aforementioned network evaluation consist of? How do you get from Point A (physical storage environment) to Point B (virtualization)?

Ask yourself the most pertinent questions:

  • How many physical servers do you have? What functions do they perform? How many do you need?
  • How many users do they serve? Are you experiencing any issues with your current servers? Are you looking to streamline any business processes?
  • What percentage of your resources is underutilized? By how much?
  • What are your present and anticipated storage requirements? How much of your existing infrastructure can you virtualize?

As server hardware and storage solutions become increasingly clogged with users accessing a growing number of applications, system responsiveness can lag at different times on different days. Asking these questions while conducting a component inventory and gathering performance metrics helps determine the amount of virtualization needed.
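The inventory-and-metrics exercise can be sketched in a few lines. The server names and utilization figures below are hypothetical; the point is simply to flag underutilized machines as consolidation candidates:

```python
# Hypothetical inventory: average CPU utilization per physical server.
servers = {
    "file-01": 0.12,
    "mail-01": 0.55,
    "web-01": 0.08,
    "db-01": 0.71,
    "test-01": 0.05,
}

THRESHOLD = 0.20  # below this, a server is a consolidation candidate

candidates = [name for name, util in servers.items() if util < THRESHOLD]
avg_util = sum(servers.values()) / len(servers)

print(f"Average utilization: {avg_util:.0%}")
print(f"Virtualization candidates: {', '.join(candidates)}")
```

Here three of five machines spend most of their time idle, which is exactly the profile that consolidates well onto a single virtualized host.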

Virtualization improves application and process access through pooling, sharing and clustering on an as-needed basis. It also reduces the need for physical solutions and their related operational and ownership costs.

Generally speaking, the best candidates for the virtualization of hardware and storage solutions are older servers requiring frequent upgrade costs, infrequently used servers, and multiprocessor servers dedicated to single-processor applications. Applications such as those in a development or test environment, those using a single processor and those with low use rates/frequent idle times are best offloaded onto virtual storage solutions.

Ever-increasing storage requirements, irrespective of business or industry, call for migration to a virtualized infrastructure. Massive file sharing, increasingly sophisticated applications and the ever present danger of costly downtime from technician mistakes or cyber attacks further underscore the need.

HP’s Converged Infrastructure systems bolster network performance, decrease maintenance and save money. These systems comprise a wide variety of server and storage solutions in addition to delivering the virtual bandwidth required to handle massive amounts of data. HP ProLiant servers, running VMware and Microsoft® Hyper-V® virtualization software, help optimize performance, simplify management, speed deployment and reduce risk.

HP Converged Storage virtual solutions bolster ROI by eliminating physical, logical and management boundaries, leveraging such technologies as deduplication, compression, metadata search and object APIs for cloud applications.

Similarly, HP Storage for Server and Client Virtualization utilizes scale-out designs with clustered architectures for optimal performance under unpredictable mixed and heavy VM workloads. Hardware-assisted thinning converts legacy storage and cuts capacity requirements by 50 percent, while enabling the deployment of new VMs in seconds. These innovations are able to cut management overhead by as much as 90 percent.
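The capacity claim is easy to put in concrete terms. Assuming the stated 50 percent reduction (the figures below are illustrative, not a sizing guide):

```python
# Illustrative effect of a 50% capacity reduction from hardware-assisted
# thin provisioning, per the vendor's stated figure.
legacy_capacity_tb = 40   # hypothetical raw capacity provisioned today
reduction = 0.50          # stated reduction from thinning

thinned_capacity_tb = legacy_capacity_tb * (1 - reduction)
print(f"Capacity required after thinning: {thinned_capacity_tb:.0f} TB")
```

The freed headroom can absorb future data growth or host additional VMs without new hardware purchases.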

Simply stated, before the introduction of virtualization technologies, businesses had to operate separate servers for incompatible, platform-specific applications. The result? Massive hardware investments and maintenance costs. In a virtualized environment, a single server can run multiple operating systems while supporting a variety of business applications. The question is no longer whether to virtualize, but rather when.


Nine Lives Media Names Network Management Group, Inc. (NMGI) to the MSPmentor 501 Global Edition

Sixth-Annual Report, Formerly the MSPmentor 100, Lists The World’s Top 501 Managed Service Providers (MSPs)

HUTCHINSON, KANSAS — February 19, 2013 — NMGI has landed on Nine Lives Media’s sixth-annual MSPmentor 501 Global Edition, a distinguished list and report identifying the world’s top 501 managed service providers (MSPs). This year’s report has been expanded extensively to include:

  • New: MSPmentor 501 Global Edition
  • New: MSPmentor 100 Small Business Edition (top MSPs with 10 or fewer employees)
  • MSPmentor 200 North America Edition
  • MSPmentor 50 EMEA (Europe, Middle East, Africa) Edition
  • MSPmentor 25 AANZ (Asia, Australia, New Zealand) Edition
  • New: In-depth data tracking mobile device management (MDM), managed cloud services and other recurring revenue opportunities for MSPs.

“This prestigious award further validates the strides NMGI is making in providing total managed services to our business clients from Boston to Honolulu. We are honored to have been selected to both the Global 501 and the Top 200 MSPs in North America,” said Steve Harper, chairman and CEO.

The MSPmentor 501 report is based on data from MSPmentor’s global online survey conducted October-December 2012. The MSPmentor 501 report recognizes top managed service providers based on a range of metrics, including annual managed services revenue growth, revenue per employee, managed services offered and customer devices managed.

“MSPmentor congratulates Network Management Group, Inc. on this honor,” said Amy Katz, president of Nine Lives Media, a division of Penton Media. “Qualifying for our MSPmentor 501 Global Edition puts NMGI in rare company.”

MSPs on this year’s global 501 list lifted their combined annual recurring revenues 24.5 percent to $2.54 billion. Together, those MSPs now manage more than 5.6 million PCs and servers, and nearly 400,000 smartphones and tablets, according to Joe Panettieri, editorial director, Nine Lives Media.

MSPmentor, produced by Nine Lives Media, is the ultimate guide to managed services. MSPmentor features the industry’s top-ranked blog, research, Channel Expert Hour Webcasts and FastChat videos. It is the number one online media destination for managed service providers in the world.

About Network Management Group, Inc.
NMGI is a national provider of consultative services with an emphasis on computer networking, business continuity, and technology services for small and midsize businesses and organizations located throughout the United States. Network Management Group, Inc. designs, implements, and manages business technology solutions for our clients. Founded in 1984, NMGI is headquartered in Hutchinson, Kansas.

For more information, contact:

Tom Hammersmith
Marketing Coordinator
Network Management Group, Inc.
(620) 664-6000 x132

For more information on Network Management Group, Inc.:

About Nine Lives Media
Nine Lives Media, a division of Penton Media, defines emerging IT media markets and disrupts established IT media markets. The company’s IT channel-centric online communities include MSPmentor, The VAR Guy and Talkin’ Cloud.

Nine Lives Media, a division of Penton Media
Joe Panettieri, Editorial Director
212-204-4206 or


Windows 8: Is it right for you?

Sanchez Williams, Systems Engineer – Network Management Group, Inc.

Ever since Windows 8 was released in October of last year, I have been asked several times when I believe companies should consider upgrading to it. Is it faster? Is it better? And, most importantly, will it increase production? These are the most common questions I receive.

Let’s start off with the most important one: will it increase production out of my current staff? Like all things technology, this is best answered with “it depends.” For most users, my experience with the product would lead me to answer no. In fact, I would expect production to drop and user frustration to increase substantially for the first few months of use while users get used to the new Metro Interface. The Metro Interface completely changes how users access their applications by placing a series of tiles on the main screen in lieu of a Start button. It can be extremely frustrating and downright confusing to use. Even after forcing myself to use Windows 8 for several weeks, I still didn’t have navigating the Metro Interface down and would commonly bypass it to get my work done more quickly. Many people wonder why Microsoft switched to this interface, and the short answer is that they wanted the same interface and feel across all devices (i.e., smartphone, tablet, laptop, PC) so that users can seamlessly move between them. There is an obvious emphasis on mobile devices, which is what makes using it as an everyday workstation so awkward.

The flip side to this design is that it works incredibly well with touchscreen devices, and not just tablets and smartphones. If you work in a factory that uses touchscreens instead of a mouse and keyboard, Windows 8 is spectacular. Navigating the Metro Interface on a touchscreen device is easy, quick and responsive. Internet Explorer is available in the Metro Interface as an “application” instead of just a web browser, making it easier to use and better looking on a touchscreen.

So did these major changes translate into a faster experience? Boot and login times are noticeably improved. During my testing, it took about half the time to get from CTRL+ALT+DEL to a usable desktop as it did in Windows 7. As for the experience once logged in, there wasn’t really a notable difference in speed or performance.

This brings us to the final question and a great way to conclude: is it better? If your company heavily uses touchscreens, I would give it a solid yes. Windows 8 was clearly designed for use on a touchscreen, and both the feel and appearance confirm this. However, if you are using it for everyday office use, I can’t say I would recommend it just yet, at least not on a large scale. I would set up a test workstation or a virtual machine for users to play with and get comfortable on before expecting solid production out of them.


Hey, What’s Going on Here?

Randy Johnston, Vice President – Network Management Group, Inc.

2013 promises to be a big year from a computer hardware perspective, with new Ultrabooks, tablets and phones, but the real news for 2013 will be in software. The basic building blocks of software, the operating systems, have been quietly going through a metamorphosis during the past few years. We will see the results of these changes positively affect our working style and ability this year and beyond. Operating systems are converging into what are called platforms, and the platform is now important, as we will see later. Some of the operating system changes made by Microsoft and others were motivated by remote connectivity, portability and the cloud, and some were motivated by trying to simplify the way we work. Applications have been swept along by this sea change… perhaps it should be called a riptide.

Since 2013 is a year of radical technology change, we have to plan our strategies. Successful strategies for your business might include: 1) strategic vision, 2) client focus, 3) working with your team, 4) simplification, and 5) technology. Choosing the right strategy and tools to service your market and clients the best way you can is a winning approach.

The Big Shifts

The vision is simple: 1) hardware is changing, 2) the operating systems that support these systems are changing into platforms, 3) the applications have to change to support mobility, web and ease of use, and 4) the backbone and infrastructure that supports all of our computing is changing, including virtualization, backup, private and public clouds, SaaS, and hosting.

First, let’s consider some background issues. We believe that brand-name computers will generally have a lower cost of ownership over the life of the product. White box clones may be cheaper to purchase initially, but operational and compatibility issues can eat up any potential savings rather quickly.

Second, there is a notable revolution in progress in the size and speed of end-user computer hardware and phablets (phones/tablets) in the market, leading to the bring-your-own-device (BYOD) revolution.

Third, we believe that the system software that runs these devices is converging, and your choice of platform determines many of your options. Most businesses have standardized on Microsoft Windows in the last two decades. We see three main platforms evolving currently: Windows 8/Microsoft, iOS/Apple, and Android/Google.

Fourth, access to software through hosting or Software as a Service (SaaS) is leading some businesses to a simpler configuration of computers in house. Some refer to this approach as the public cloud. In other cases, system requirements have become so complex that it is rare for a single IT professional to be able to install and maintain everything in a complete system. NMGI can very effectively run an entire IT operation from our offices with today’s remote support tools. The support approach used for in-house systems, often called managed services, allows a trusted and knowledgeable technician to maintain your system, often to the level of installing updates to applications such as accounting or your operating systems, from anywhere.

Because of these factors, you should pick an end-user computer hardware strategy that fits your needs. However, it is pretty clear that computer hardware platforms matter less today than they did five years ago.

Consider the impact of Ultrabooks and phablets and look at end-user computing hardware today:

  • Desktop: fastest; largest; not portable; $450-1,000
  • Notebook: can be close to desktop speed; heaviest and largest portable device; $600-1,900
  • Ultrabook: light, yet close to notebook speed; close to tablet size; $700-1,100
  • Netbook: low cost but slow; heavier than an Ultrabook; $300-600
  • Tablet: slowest; smallest and lightest; $150-1,200

All of these computing tools can be used to run in the cloud. The netbook, tablet and smartphone do the worst job of running applications at high speed today, but they are very portable. The backbone is getting stronger and the applications are getting better to make these devices more usable. However, they are still best for consuming content, reading results, answering a few emails or taking notes, but are not very good for heads-down data entry.

But the big news?

The big news in technology is the seismic shift in operating system convergence to a single platform. For example, in 2013, it is pretty clear that technology platforms and operating systems will converge. Think: Windows 8, iOS/Mountain Lion, or Android Ice Cream Sandwich on phablets and computers. These three platforms are being designed so the same operating system, or one that looks and operates in a similar fashion, runs on your phone, tablet or computer. When you buy into a platform from a vendor, the way applications integrate and work together is largely controlled by the vendor. As you can guess, this is a fight between Microsoft, Apple and Google right now. Some of the fight is controlled by intellectual property, patents and lawsuits, some is controlled by innovation and ease of use and some is controlled by application availability. A future that allows applications to seamlessly run between a phone, tablet and computer could be very attractive as long as the application behaves appropriately on the different devices. Even more attractive would be a future that allowed applications to run on any platform and seamlessly work together.

For me personally, it has been a blessing to switch from one phone to another frequently, including iPhones, Android phones and Windows Phones. Most of you don’t get the benefit of doing this because of your multi-year contracts. I usually carry a phone for 90 days or so to learn how it works, and then pass the product on to someone else. Today, I’m carrying a Windows Phone 8X by HTC, because I wanted to see how the Windows phone environment worked in conjunction with my Windows Surface tablet and my Windows 8 computer. For sure, I say wow! While these technologies are far from perfect, and the Windows 8 phone and the Windows Surface tablets may not quite be ready for the mass market, I’m able to do key tasks quicker on my Windows phone than I could on my iPhone. Another big recent discovery was that sharing data among the Windows Phone, Windows Surface and Windows 8 is a seamless experience. While I prefer running Windows 8 with a touchscreen, it has also become apparent that with Microsoft or Logitech mice that have updated software for Windows 8, an upgrade to touchscreen hardware is not mandatory. Further, the more sophisticated software for these mice, such as SetPoint, actually makes Windows 8 more productive than Windows 7. The choice of platform has made my work processes easier. Even though the Apple iPhone was more elegant and simple, it is not easier than running Windows everywhere. With my Mac Air 2, iPad and iPhone, I did not have near the integration possible with Windows 8; likewise with my Google Chromebook, Google Nexus 7 and Motorola Droid phone.

Platform limits choice while enabling ease of use. If we choose a particular vendor, we get the most benefits and the most restrictions by adhering to the vendor’s rules. Think of iCloud and iTunes as enabling your ability to shop easily and restricting your choice to what is in the Apple Store. Microsoft and Google are trying to mimic this model. Is a single supplier in your business’ best interest? Some say yes, while others say best of breed supports their strategy most effectively.

Some of our greatest frustrations come from hardware failures, inconsistent results or confusing design. Platform will minimize the differences between hardware run within the family. Each device will work in a similar fashion. Most of us couldn’t care less what the hardware or software is or whose brand is on it, as long as it runs reliably 100% of the time and helps us get our job done. Platform will help us build our ideal future. Consider your platform choice and your providers carefully to get the best results for your business.


5 things small businesses forget when planning for a new or redesigned website

by Jeff Graber, Media Services Executive – Network Management Group, Inc.

There is a lot to think about when planning for a new website or a redesign of a current website. There’s marketing, design, branding, accessibility, and a ton of other things that go into a successful website. But there are some things that are frequently left out or forgotten in the planning process. I have put together a short list of some of the more critical things that are often excluded from the planning process:

1. Mobility

By now, the case for investing in a mobile website is obvious. The use of mobile devices for web browsing doubled in 2012, while smartphones and tablets became even more popular and powerful. Some of the characteristics of an effective mobile website include fast loading time, navigation optimized for touchscreens, and an easy contact option such as a call button. Having a mobile website keeps mobile visitors engaged longer because information becomes easier to find and frustrations associated with loading delays and formatting go away.

2. Search Engine Visibility

Another important characteristic of your website is how easily it can be found in search engines like Google and Bing. If you are not taking specific measures to ensure visibility in search engines for certain keywords, then you will not be found by people searching for your products or services on the web. Some things to consider when optimizing your website for search engines include the locations where you primarily do business and the products or services you provide.

3. Content Sources

Something a lot of businesses don’t think about when planning for their website is where the content will come from. Will employees write website content in-house? Will content come from marketing materials that have already been written? Will content creation be outsourced to professional copywriters? These questions should be answered during the planning process, not after a website is built or redesigned. And remember, the quality of your content is a reflection of your business. So make sure your content is easy to digest, well worded, and grammatically correct.

4. Content Management Solution

Once you have a plan for where your content is coming from, you will need a way to manage that content and get it to your website. This is where a content management system comes in. A content management system stores your content and allows your website to pull it in and display it. With a content management system you can update content on your website without any web development knowledge or experience. A complete content management system should include easy online access with a secure login, options for publishing now or in the future, and version control.

5. Backup & Recovery

Finally, the most critical consideration for your website is backup and recovery. If your website goes down, you will need a way to restore your data and get the site back up quickly. Your website backup plan should fit into your business continuity plan. Some options to consider for a reliable backup solution are offsite backup, file backup as well as database backup, daily or nightly incremental backups, and restore procedures.

If you are planning for a new or redesigned website, NMGI is here to help with these and other website considerations. If you would like more information, give us a call at (620) 664-6000 or visit our website at



What is Network-Attached Storage?

You may find yourself hitting snags when it comes to maintaining and storing increasing amounts of data for your business. One solution to these problems is a network-attached storage device.

Network-attached storage (NAS – usually pronounced “naz”) is used to increase storage capacity on a network. A NAS device’s sole purpose is to store and provide access to files. It is a self-contained unit that typically consists of multiple storage drives in a RAID (redundant array of independent disks) configuration. Configuring a NAS device is simple and makes your data easy to manage.

Advantages of Network-Attached Storage

Efficiency. A network-attached storage device is a self-contained unit that runs an independent, specialized operating system and has its own IP address. Because of this, a NAS device does not rely on your server’s processing resources. This enhances the overall performance of your network.

Data Protection. A NAS device protects your data via a redundant array of independent disks (RAID), which replicates data across multiple drives. This configuration considerably minimizes the risk of data loss.
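The mirroring idea behind a RAID configuration can be sketched in a few lines of Python (a toy model with made-up names, nothing like a real RAID controller): every write is duplicated to each drive, so the data survives as long as any one drive does.

```python
import os

class Mirror:
    """Toy RAID-1-style mirror: writes go to every drive; reads succeed
    if at least one drive still holds an intact copy."""

    def __init__(self, drive_dirs):
        self.drives = drive_dirs  # directories standing in for physical drives

    def write(self, name, data):
        # Replicate the data across all drives.
        for d in self.drives:
            with open(os.path.join(d, name), "wb") as f:
                f.write(data)

    def read(self, name):
        # Fall back to the next drive if a copy is missing (a "failed" drive).
        for d in self.drives:
            path = os.path.join(d, name)
            if os.path.exists(path):
                with open(path, "rb") as f:
                    return f.read()
        raise FileNotFoundError(name)
```

Losing one drive in this model loses nothing, which is why a RAID-equipped NAS considerably lowers the risk of data loss compared with a single disk.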

Data Sharing. Multiple users can access data on a NAS device, regardless of which devices and operating systems they use. This is especially useful if your business has multiple locations and requires centralized data storage. Users can access data in Windows Explorer the same way they would access any other drive, or through a web browser.

Cost. You can host multiple terabytes of data on your network using NAS technology for far less than the cost of hosting data on a traditional fileserver. Additionally, the prices of quality NAS devices have fallen considerably in the past few years – making them an even more viable option for your data storage needs.

NAS increases the manageability of your data while decreasing the cost of hosting it. A NAS device is a reliable and versatile option for storing, sharing, and – most importantly – protecting your files.

We leverage this type of device as part of our NetRescue solution as a great replacement for high-maintenance, error-prone tape backups. Learn more about our NetRescue solution, and all of our offerings here!


7 Advantages of Managed IT Services

by Chase Moritz, Heartland Technology Solutions

Many small to mid-sized organizations are turning to Managed Service Providers (MSPs) to help alleviate some of the strain on their internal IT resources, or to take over the management of their network altogether.

There are quite a few benefits to partnering with an MSP, from improved reliability to a better understanding of assets, which are all valuable in their own right. These are a few that top the list for most organizations considering some form of Managed IT Services…

So, first of all, what is ‘Managed IT Services’?

A quick Wikipedia search turns up the definition: “Managed services is the practice of outsourcing day-to-day IT management responsibilities as a strategic method for improving operations.”

A managed services provider (MSP) is defined as “…typically an information technology (IT) services provider, who manages and assumes responsibility for providing a defined set of services to their clients either proactively or as they determine that the services are needed.”

  1. 24/7 Monitoring & Proactive Maintenance: As it stands, most businesses have no way to monitor their network 24/7/365 and be notified immediately when a potential problem arises. Monitoring can be invaluable because the earlier an issue is detected, the earlier it can be resolved. Around-the-clock monitoring allows most issues to be caught and proactively resolved before they become major problems that cause significant downtime.
  2. Budgeting: Planning for technology is difficult because things can change at a moment's notice if a computer or server crashes, and when that happens, budgets are busted on unplanned repair services. With a fixed monthly cost, Managed Services let businesses budget more accurately for their service costs and prepare for upgrades. Most MSPs build repair and service work into the contract, along with the ongoing maintenance already taking place.
  3. Comprehensive Reporting: Insight into the activity on your network is vital for deciding how to allocate your future budget and for knowing what your employees are doing online. Without this reporting, you have no real way of knowing which issues have been resolved, where issues occur most frequently, or which areas of your network need to be shored up. Most MSPs can provide in-depth reporting on error messages, problem remediation, and user activity, and should offer at least a twice-yearly review of these reports so that you understand what is happening on your network.
  4. Staff Availability: Whether your business has an internal IT staff or someone with split duties within your organization, you have a resource dedicated to handling the day-to-day maintenance and upkeep of your network.

    If that person is a dedicated IT employee, they probably don't have time to do everything that needs to be accomplished because they are constantly putting out fires. If that person has split duties (IT and accounting or operations, for example), they don't have time to focus fully on either role.

    A basic Managed Service package allows your IT person to focus on what really needs to be done instead of the day-to-day maintenance and allows your split resource the ability to focus on their primary job, instead of worrying about the technology working properly.

  5. Efficiency: With the toolsets available to MSPs, issues are reported and acted on immediately. Because “fires” no longer have to be put out after they have already caused noticeable problems or downtime, end users get a more streamlined and efficient experience. With proper planning and notification, patches and updates can be scheduled so that they cause no disruption to end users during working hours.
  6. Knowledge Base: When working with an outsourced technology company of any kind, one of the key benefits is being able to access their broad range of knowledge on more than one area of focus. They have an entire staff available to tackle just about any issue that may arise as well as advise or take on any project that needs to be implemented.  A good technology provider invests in its staff to ensure that their engineers have the most recent certifications for the services they offer and vendors they work with.
  7. Improved Security: Security is a common concern among business leaders, and keeping up-to-date security protecting your network is arguably the most important aspect of a business’s technology. Managed Service packages generally include monitoring your firewall and anti-virus software and applying the latest updates and patches to keep your network as secure as possible.
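The "monitoring" in item 1 ultimately means automated checks running around the clock. A minimal sketch in Python (MSPs use dedicated monitoring platforms, not a hand-rolled script like this) might simply test whether a service still accepts TCP connections and alert when it stops:

```python
import socket

def check_service(host, port, timeout=2.0):
    """Return True if a TCP service at host:port accepts a connection
    within the timeout -- the kind of heartbeat a monitoring agent
    repeats every few minutes, day and night."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        # Refused, unreachable, or timed out: the service looks down.
        return False
```

An MSP's tooling wraps thousands of checks like this with scheduling, escalation, and ticketing, which is what turns a failed connection at 3 a.m. into a technician fixing the problem before the office opens.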

7 ways to make your PC last longer

used with permission from HP Technology at Work

A primary concern for most business owners is getting the most bang for your buck, and when you’re purchasing expensive technology, that concern becomes even more pressing.

Buying computers is one of the larger investments you have to make in order to effectively run a business. To avoid surprise crashes and loss of data, it’s not recommended to hold onto a frequently used computer for more than four to five years. However, there are things you can do to help prolong its life span and enable it to perform better over time, saving you money in the long run.

Keep it clean

Dust, dirt, food and other particles tend to accumulate in the crevices of keyboards, mice and monitors. If not removed, these particles can scratch hardware components and eventually build up enough to cause overheating, shortening the life of your computer. To avoid this, make sure to dust and clean your computer and its accessories on a regular basis. Compressed air is a great way to get small particles out of keyboards and tight cracks. Read Cleaning Your Desktop PC for more detailed information on how to clean a desktop, some of which can also be applied to notebooks.

Keep it dry

PCs and liquids do not go together. Never drink or set water, coffee, soda, or any other liquid near a desktop or notebook; a spill could mean buying a new one much sooner than you had planned. For a little extra insurance and protection, HP offers optional HP Accidental Damage Protection Care Packs.

Give it space

This tip applies mostly to notebooks. The nice thing about them is that they’re portable. On business trips, it can be tempting to set them down on a hotel pillow or bed while you’re casually answering emails or doing research. But soft, padded surfaces do not allow airflow into the ventilation holes underneath the notebook, which leads to overheating. To limit this risk, make sure you always rest your notebook on a cool, solid surface, allowing air to travel underneath it.

Protect it

Viruses could be the biggest threat to the health of a notebook or desktop. One of the very first things you should do when you buy a new computer is install anti-virus software. Some popular, effective applications include Microsoft Security Essentials and McAfee. Make sure you also take advantage of HP Protect Tools, a suite of security tools available on many HP PCs that lets you manage security for all of your business desktops and notebooks from one central point.

Give it more memory

Painfully slow performance is a sign that your computer may be starting to fade. Adding extra RAM (random access memory) relieves the strain on an overloaded machine: once it no longer has to swap to the much slower hard disk, your computer will feel noticeably faster. Check out HP’s EasyBundle for an easy way to upgrade your notebook’s hard drive.

Keep it uncluttered

All of the programs that you don’t currently use on your computer are taking up valuable space. Getting rid of them will improve performance and save memory. Most PCs have a “disk cleanup” function that will delete “unseen” files and empty caches. You can also go through your files manually and remove anything you haven’t been accessing.

Choose it wisely

There are many things to consider when purchasing a computer. If you’re looking for a desktop or notebook that can stand the test of time and endure harsh environmental conditions, an HP Elite PC may be your answer. All Elite products must endure 115,000 hours of durability testing to prove they’ll be able to give you many years of reliable service.

Unfortunately, no matter how well you take care of it, the reality is that no computer can live forever. To decrease your risk of a catastrophic crash, make sure you don’t wait too long to buy a replacement. After several years, most computers start to display glitches and show signs that they could be struggling. To be on the safe side, you should plan on replacing your desktops and notebooks about every four years.