Reflections on Kaseya Connect 2017

Standard

As I’m sitting here in McCarran International Airport awaiting my JetBlue red-eye back to Boston, I’ve been reflecting on my week here in Las Vegas at the Kaseya Connect conference.  Kaseya is one of several technology partners that we have at Internet & Telephone, LLC.  Specifically, we use the Kaseya Virtual System Administrator (VSA) IT management platform as well as AuthAnvil two-factor authentication.  Both are part of our stack of specialized tools that we use to manage our customer infrastructures.

This was my first time attending Kaseya Connect and I’m impressed with the company and their roadmap for the future.  What’s significant about this is that a few years ago, it looked like the company was moving in the wrong direction and was no longer going to be a good partner for us.  That is no longer the case, at all.

We started the week off with the Customer Success Council meeting on Monday.  During this invitation-only meeting, Kaseya executives shared details on upcoming product developments and new initiatives, including recent and planned acquisitions.  Following this meeting, Kaseya hosted a focused security symposium.

During the symposium, some interesting statistics were shared from the 2017 Verizon Data Breach Investigations Report.  This report has become the annual standard-bearer for the state of cybersecurity in the commercial market space.

Some highlights from the report:

  • 62% of breaches involved hacking.
  • 81% of hacks used stolen or weak passwords.
  • 51% of hacks used malware to steal passwords.
  • Over 1 billion credentials were stolen in 2016.
  • It is recommended to deploy two-factor authentication to all users when feasible.

When evaluating two-factor authentication, note that it satisfies requirements including:

  • HIPAA for healthcare organizations.
  • FFIEC for small banks and credit unions.
  • CJIS for law enforcement agencies.
  • The latest revision of the legal professional code of conduct requires it for remote access.

Following are some updated stats about data breaches, in terms of impact:

  • Every record breached costs a company $158.00, on average.
  • The average number of records breached, per data breach, is 3,000.
  • This works out to an average cost of roughly $475,000 per data breach.
  • Short term impacts of a data breach are downtime, lost data and business interruption.
  • Long term impacts of a data breach include damage to the company’s reputation, customer loss and lost revenue.
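Multiplying the quoted averages is a quick sanity check on the per-breach figure above (a purely illustrative snippet; the quoted figure rounds the result up slightly):

```python
# Back-of-the-envelope check of the breach-cost averages quoted above.
cost_per_record = 158.00    # average cost per breached record (USD)
records_per_breach = 3000   # average records exposed per breach

average_breach_cost = cost_per_record * records_per_breach
print(f"Average cost per breach: ${average_breach_cost:,.0f}")
# prints "Average cost per breach: $474,000" (rounded to $475,000 above)
```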

On Tuesday morning, Kaseya CEO Fred Voccola kicked off the event with an engaging keynote that shared interesting stats that you may read more about in my post from Wednesday titled Small and Medium Size Business Stats from Kaseya Connect.

Fred also provided a comprehensive review of what Kaseya calls its IT Complete stack.  This includes the core feature set of the VSA platform that we use to manage our customers, as well as new or updated modules focused on network management, identity and access management, backup and disaster recovery, Office 365 management and backup, and an impressive Cloud management module that will allow us to help our customers save money on their Cloud subscription costs.

I was also intrigued by some new initiatives around data analytics to help us manage our business better and deeper integration with our customer documentation system.

Kaseya did a very good job outlining the product roadmap and how we will be able to leverage these developments to help our technical team better manage our customers.

We have built our security offerings around the National Institute of Standards and Technology (NIST) Cybersecurity Framework.  I was very excited to see that Kaseya has built their security offerings around this same framework.  Aligning our security strategy with what Kaseya is delivering, and plans to deliver, to its partners will be a compelling benefit for our customers.

There was also a very interesting session on improving the user experience with IT.  Using something called persona modeling, the session presented an intriguing approach to better understanding the needs of IT users based on their role in the organization, from individual, departmental and overall company mission points of view.  I’m looking forward to testing this out to see what opportunities for improvement it may bring to the surface.

The conference wrapped up on Thursday with the entire Kaseya executive team sharing their thoughts on where the industry is moving, based on their individual areas of responsibility.  This touched on all aspects of the solutions that Kaseya brings to its partners.  I was particularly pleased to gain some insight into the company’s Internet of Things (IoT) strategy.  IoT covers the myriad devices that now have an IP address and connect to the network.  As these devices become more prevalent and important to a company’s success, they must be managed like every other device on the network.  Right now, there is no consistent model for accomplishing this, so organizations need to be careful about deploying unmanaged devices onto their corporate networks.  As we saw several times in 2016, these devices, left unprotected and unmanaged, can be taken over and used to carry out a data breach or attacks on other organizations.

We also had the opportunity to have a private meeting with several members of the senior team to discuss our business plans and the status of our partnership.  I was very pleased with the level of transparency and candor from everyone at this meeting and I am looking forward to working more closely with everyone at Kaseya to deepen our partnership for the benefit of our customers and our two companies.


Why a Hybrid Cloud Strategy is Critical

Standard

I thought I would share a real-world example of why hybrid Cloud is the right strategy for almost any business.  For years, backup and disaster recovery have been the buzz, but what if something accidental happens that could knock one of your most important people offline for a day or more?  How would you deal with the interruption?  Would you be OK with one of your company’s key people being idle without warning?  Consider the following, which happened to me over the last 24 hours.

Yesterday, Webroot, one of the leading anti-virus/anti-malware software companies, inadvertently released an update that caused havoc for some of their customers.  To make a long and complicated story short, the update flagged legitimate Windows operating system files as malware and quarantined or potentially deleted them.  Needless to say, this caused a lot of disruption for millions of users yesterday.  Webroot identified the issue within 15 minutes and immediately pulled the problem update.  While their response was rapid and appropriate, some users had already picked up the update, with catastrophic consequences in some instances.

Here is what I experienced yesterday.  I was working along when, suddenly, I could no longer open a spreadsheet I was working on.  Within minutes, my PC, a Microsoft Surface Book running the latest pre-release Windows 10 update, started to crash.  Every reboot resulted in a blank screen and an eventual “Green Screen of Death” (the latest successor to the infamous “Blue Screen of Death”).  This has been the most reliable PC I have ever owned, so I knew this was not a normal issue.

Enter our hybrid Cloud infrastructure to the rescue.  I was able to jump on an available computer and work in multiple web browser windows like nothing had happened.  I was in Outlook Online, part of Office 365.  I was able to open and work with my Word files, my Excel spreadsheet and others, all from the browser.  I rely heavily on OneNote to organize my day.  Enter OneNote Online: in the browser, I was working away with my most up-to-date notes, since they sync to OneDrive almost as soon as I update whatever notebook I am working in.

Our Line of Business applications, those pieces of software that are specific to the work we do, all run from our data center, which is also geographically redundant and backed up.  In short, with about 8 browser tabs in use at any point in time, I was back to work in no time, recovering my damaged Surface Book without losing any productivity.

I had access to everything I needed, because my entire world, personal and professional, is made up of Cloud hosted applications along with applications hosted in our corporate data centers.  I lost nothing and was easily able to reload the operating system, application software and data while I busily continued on with my day.

One personal hint I will share is that I maintain a list of all my current software applications and registration information, which makes it easy to reload everything by stepping through the list.  Amazon Drive is my go-to Cloud storage for my personal data and I use GoodSync to keep it synchronized in real time.  My corporate data is all in my Office 365 email and our corporate databases and file shares.  I lost absolutely nothing: no data, no settings, no customizations.  Everything worked exactly as designed.

I hope this little unexpected real-world business continuity exercise helps you understand the value of a hybrid Cloud infrastructure for your personal and corporate applications and data.  It’s always nice to have a well-designed strategy.  It’s incredibly rewarding when it works as designed and allows you to deal with an unexpected event that could have had a catastrophic impact.

Safeguard company data on employee phones

Standard

The following was originally published on May 29, 2016 on Seacoastonline.com.

Everyone has a mobile device, be it a smartphone or tablet. One of the key questions most business owners and managers have is whether the company should provide these tools to employees who need them or allow employees to use their own devices, often referred to as “Bring Your Own Device.”

Whether a company should provide a smartphone and/or tablet to an employee or allow them to use their own may be driven by a number of factors. Among these factors are cost, standardization, position with the company, department and job role, just to name a few. Regardless of the decision, whether company-wide, department-wide or employee by employee, one thing you have to be sure you have in place is an appropriate mobile device policy the employee signs.

If you allow only one or the other, it’s a bit easier in that you only need one policy document. If you allow both, you will need one policy document for company-provided mobile devices and one for employee-owned devices that are allowed to connect to company resources.

A mobile device policy document should address several critical aspects of employees’ use of these devices. These include statements related to mandatory compliance with the policy, enforcement and changes to the policy, standard definitions so there will be no confusion about what the policy refers to, supported devices, permitted and restricted uses, approved company applications, privacy and monitoring, erasure and preservation of data, sharing of devices and reporting of lost or stolen devices. Also to be included should be costs, usage, security and confidentiality as well as personal use and personal data on the same device as company data.

The policy should identify the device or devices assigned to the individual and be signed by that individual. Included in this policy, or a companion policy document, should be a clear statement about texting while driving, mobile device use while driving and adherence to any state laws related to mobile device use wherever the employee lives or operates. More and more states are adopting these laws. In fact, New Hampshire has one of the toughest hands-free laws in the nation. Be sure you understand it clearly.

A consideration specific to employee-owned devices is how to appropriately secure any company data on the mobile device. Any mobile device connected to company resources should be governed by mobile device management software that allows you to control what devices connect to company data as well as remotely wiping those devices in the event of loss, theft or termination of employment, regardless of the cause. The only issue with this specific approach is that any personal apps and data on the mobile device will be lost if it is wiped. Therefore, you need to be sure your mobile device policy makes it clear the employee is responsible for backing up their personal apps and data on the device as it could be lost.

Better technology is also available to help with this. Mobile device management tools exist that allow you to specify not only which devices are allowed to connect to company resources, but more importantly, what apps and data are company apps and data. This allows a mobile device to be selectively wiped, only erasing apps and data that belong to the company, leaving the personal apps and data intact. This type of technology presents a much more effective way to manage these devices while being certain company policies and data are properly safeguarded.
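The selective-wipe idea above boils down to tagging each app (and its data) as company-owned or personal, then erasing only the former. This is a minimal sketch of that logic; the names here are illustrative, not any real MDM vendor's API:

```python
# Illustrative sketch only: "App", "Device" and "selective_wipe" are
# hypothetical names, not a real mobile device management product's API.
from dataclasses import dataclass, field

@dataclass
class App:
    name: str
    company_owned: bool  # True if the app and its data belong to the company

@dataclass
class Device:
    owner: str
    apps: list = field(default_factory=list)

    def selective_wipe(self):
        """Erase only company apps and data; leave personal apps intact."""
        wiped = [a.name for a in self.apps if a.company_owned]
        self.apps = [a for a in self.apps if not a.company_owned]
        return wiped

phone = Device("employee", [App("CompanyMail", True), App("FamilyPhotos", False)])
assert phone.selective_wipe() == ["CompanyMail"]
assert [a.name for a in phone.apps] == ["FamilyPhotos"]
```

The key design point is that the tagging happens at enrollment time, so when a device is lost, stolen, or the employee leaves, the wipe decision is already encoded and only company data is removed.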

If you have employees using mobile devices, even if just to access email, be sure you have these things in place. If you have not been audited for these requirements yet, it’s only a matter of time, so be proactive and get these policies and technologies in place to protect your business.

Email archiving important for your business

Standard

The following was originally published on May 15, 2016 on Seacoastonline.com.

In today’s world, email has become one of, if not the, most important and relied-upon communication mediums in business. In addition to being a communication platform, it has also become an unintended digital file cabinet for many people.

How many folders do you have in your mailbox? Just the default of Inbox, Deleted Items, Drafts, Outbox and Sent Items? I doubt it. I have seen some statistics that suggest average mailboxes contain dozens of folders beyond the defaults and upwards of 25,000 individual messages.

While mail servers have matured over the years to support this exponentially growing use of email as a file storage medium, as well as the increasing use of electronic mail for near real-time communication, there are still limitations. There are numerous considerations to take into account when using email to both communicate and store information for future reference.

While most mail systems will support extremely large mailboxes, some to the tune of tens of gigabytes and hundreds of thousands of items, the computer you are working on could cause performance issues even though you have not theoretically reached the limits of your mail system. An example of this would be a large mailbox, say in the area of 100,000 mail items and 20 gigabytes in size. With most mail systems, the mail is stored both on your server and in a local cache: a copy of what’s on the server, stored in a single file on your computer’s hard drive.

With a large mailbox, this generates a lot of read and write requests to the data and may put a heavy load on your computer’s resources. The nature of this activity benefits from newer, faster drives like solid state drives, as opposed to traditional mechanical drives. Random access memory, RAM, also helps in this case. A computer that has worked quite well for years may seem to be underperforming when dealing with a very large mailbox. You may find your email slow to work through, your email application periodically pausing, search not functioning reliably, and other symptoms. It can be a very frustrating experience.

Email archiving is a solution to many of these issues and more. Email archiving is often misunderstood, as many email applications have an archiving feature, yet that’s not the same as archiving in the context of this article. Email archiving within an email application simply involves copying messages, based on a range of criteria, from the original mailbox to an archive copy and then removing those messages from the original mailbox. While this makes the original mailbox smaller and easier to work with, it simply transfers a given number and size of messages from the mailbox to this archive, which is also sitting on the local computer hard drive. Thus, performance impacts may still be in play as these archives continue to grow, just as the mailbox itself has been.

True business class email archiving involves setting retention rules on the mailboxes to automatically clean out email messages over a certain age, say anything 18 months old and older. These messages are automatically captured into an online archive that is outside of and separate from the mail server and your mail application. This keeps the size of your mailbox within established best practices limits while ensuring you are able to search and find historical email reliably and efficiently.
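A retention rule like the 18-month example above amounts to a simple age filter applied on the server side. This hypothetical sketch shows the selection step an archiving system might perform (the function and data shapes are illustrative, not any vendor's API):

```python
# Hypothetical sketch of the retention rule described above: messages
# older than roughly 18 months are moved from the mailbox to the archive.
from datetime import datetime, timedelta

RETENTION = timedelta(days=548)  # roughly 18 months

def apply_retention(messages, now):
    """Split (received_date, subject) pairs into mailbox vs. archive lists."""
    mailbox, archive = [], []
    for received, subject in messages:
        (archive if now - received > RETENTION else mailbox).append(subject)
    return mailbox, archive

now = datetime(2017, 5, 1)
msgs = [(datetime(2017, 4, 1), "recent quote"), (datetime(2015, 1, 1), "old thread")]
mailbox, archive = apply_retention(msgs, now)
assert mailbox == ["recent quote"] and archive == ["old thread"]
```

In a true archiving product this runs automatically and the archived copies live outside the mail server, which is what keeps the working mailbox within best-practice size limits.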

What’s unique with archiving like this is that your current mail is also available in the archive as soon as it is received or sent. While you may still use your mail application to manage and search for messages within the time limits that are enforced, in this example 18 months, you would also be able to rely one hundred percent on your email archive to search for any mail message. Another unique element of email archiving is that you are able to search for messages by conversation or topic. This allows you to bundle up all mail messages related to a particular topic and recover them as a single file. This can be very valuable when dealing with contract negotiations, legal issues and more.

With email being so heavily relied on, problems can’t be prevented all the time, but as companies grow and email usage increases, archiving is an excellent practice to adopt to help manage that growth while ensuring people can do their work efficiently.

30% of data backup disasters caused by accidents

Standard

The following was originally published on April 3, 2016 on Seacoastonline.com.

Did you know this past Thursday, March 31 was World Backup Day? It was and hopefully you believe in good backups to protect your data, whether personal or professional.

World Backup Day is a private initiative that seeks to increase awareness of the importance of data in our lives. Because of this, backup is no longer a luxury, it’s an absolute must to protect our digital filing cabinets, photo albums, electronic medical records, document management systems, databases and more. I love the tag line for World Backup Day: “Don’t be an April Fool, Be prepared. Backup your files on March 31.” I’d actually change it just a bit, to say not only March 31, but every day of the year.

With the advent of imaging technology and online backup, backing up your data is far simpler than it used to be. I remember the days of managing multiple sets of backup tapes, one that would stay in the office and one that would go to the bank safe deposit box for secure off-site storage. These systems were extremely expensive and manually intensive, not to mention very prone to human error. Only larger companies could afford to maintain a proper backup rotation that would ensure they were keeping enough backup copies to protect the business. For individuals, backing up was nearly impossible as you would need multiple disks of one form or another, to copy all your data for backup. This was a very manual process and extremely prone to human error, most frequently in the form of simply forgetting to do it.

Fast forward to the present, and USB drives with massive capacity and online backup services make this process far more efficient and reliable. Today, most organizations employ a form of imaging to take snapshots, digital copies, of their data several times a day and then copy these backup images to secure off-site data centers. Most organizations also employ encryption to secure these images as they are created, so that as they are transferred to the off-site data center they can’t be intercepted by hackers or electronic thieves. These systems ensure data is protected and, in the event of loss or corruption, able to be quickly restored. In many cases, organizations also have the ability to restore their data in the Cloud and make it accessible, on a temporary basis, to keep business operations intact when there is a larger problem with the source network.
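What makes several snapshots a day practical is that each image only needs to capture what changed since the previous one. As a purely illustrative file-level sketch of that selection step (real imaging products typically track changes at the disk-block level, and encryption happens separately before the image leaves the network):

```python
# Illustrative file-level sketch of incremental snapshot selection.
# Real imaging products work at the disk-block level, but the idea is
# the same: capture only what changed since the previous snapshot.
import os

def changed_since(root, last_snapshot_time):
    """Return paths under root modified after the previous snapshot time."""
    changed = []
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            if os.path.getmtime(path) > last_snapshot_time:
                changed.append(path)
    return changed
```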

For individuals, there are numerous backup services that run continuously on your computer to back up your data as you create it, in real time. The key to these services is that they are highly reliable and fully automated, so as long as your computer is turned on and connected to the Internet, the backup software is protecting you with little to no interaction required. Gone is the risk of human error. In the consumer market, these services are also extremely inexpensive, so if you can afford a computer, you can afford to back up your important data.

Why you should back up should be obvious, but the simple fact that there is a World Backup Day underscores the point that even though most computer users do back up, there is still tremendous room for improvement. Consider some of these risk factors: more than 100 mobile devices are lost or stolen every minute; one in 10 computers is infected with some form of virus or malware; 30 percent of people have never backed up; and 30 percent of disasters are caused by accidents. Don’t be someone who contributes to these statistics.

My wife is a professional genealogical researcher, and backup and proper preservation are paramount to her work. Decades- and even centuries-old documentation is being converted to digital form in the hopes of preserving this important information for generations to come. As we continue to migrate into an increasingly digital world, it’s critical that we preserve important records for the future. Imagine researchers of the future not being able to find historical data about people, companies and entire civilizations. So even though World Backup Day has come and gone, make every day World Backup Day. The future will thank you.
