Welcome to the Frontpage
Since 1981, Computer Technology Review has been an authoritative source on data storage and network technologies. Today, we cover emerging technology and solutions in e-discovery, compliance, virtualization, data security, backup, and disaster recovery. http://www.wwpi.com/index.php?option=com_content&view=frontpage

Verizon 2014 Data Breach Investigations Report: Change Auditing on Guard against Insider Misuse http://www.wwpi.com/index.php?option=com_content&view=article&id=17783:verizon-2014-data-breach-investigations-report-change-auditing-on-guard-against-insider-misuse&catid=322:ctr-exclusives&Itemid=2701741 by Michael Fimin

The Verizon Annual Data Breach Investigations Report is one of the most anticipated computer security reports of the year. Based on actual incidents, it shows the state of IT security by analyzing trends across security breaches. One of its chapters, “Insider and Privilege Misuse,” describes the mechanisms used to compromise organizational intellectual property. According to the Verizon 2014 Data Breach Investigations Report, corporate information is one of a company’s most valuable assets, and the company’s ability to control access to sensitive data and its dissemination inside and outside the company can determine its edge over competitors, and therefore its future in the market. That is why corporate information is so often compromised by unscrupulous employees, whether for personal benefit or other reasons. This article takes a close look at the Report’s chapter on insider threats, follows its main points, and explains why tracking changes is so important for strengthening security and protecting systems against insider misuse.

kim_borg@wwpi.com (Kim Borg) frontpage Mon, 15 Dec 2014 19:09:06 +0000
Hyperconvergence 101: How to Tame Data Center Chaos http://www.wwpi.com/index.php?option=com_content&view=article&id=17780:hyperconvergence-101-how-to-tame-data-center-chaos&catid=210:ctr-exclusives&Itemid=2701757 by Scott Lowe

There’s a new data center technology on the block, and this time it’s the real deal: not a fad or a flash in the pan, but a technology with the potential to reshape how many organizations deploy and manage their business-critical data center environments. Called hyperconverged infrastructure, it has actually been around for a couple of years, but gained new attention at VMworld 2014 with VMware’s announcement that it was jumping feet first into the hyperconvergence waters.

No technology is a silver bullet that will instantly solve every problem ailing today’s information technology environments, but in many situations companies may find that hyperconverged infrastructure helps them make great strides toward the ever-elusive goal of IT and business alignment. In other words, with the right approach, hyperconverged infrastructure may be able to bring sanity and flexibility to what can often seem to be chaotic and rigid technology environments.

The State of Play
Most data center environments today bear the following characteristics:

  • They are heavily virtualized. In fact, the virtualization layer has become the de facto standard layer upon which new business workloads are deployed.
  • They face performance issues in some area, generally around storage. Storage is truly the Achilles’ heel of the data center. Although this critical resource has gotten a lot of attention in recent years, there remain serious storage challenges, particularly in organizations attempting to implement modern applications, such as virtual desktops and data analysis tools.
  • There are management interfaces galore. Walk around the typical data center and one thing becomes abundantly clear: there are different management interfaces for every component, including servers, the hypervisor, storage, load balancers, WAN optimizers and a whole lot more.
  • Resource management looks like a series of silos. Due to the resource-centric nature of many data centers, it can be difficult to find all of the requisite skill sets to manage the environment in a single individual. As such, many companies are forced to hire a lot of people to manage a lot of different resources. This adds significant cost and complexity and increases the intra-team communication issues that can plague productivity.
  • Scaling is an “event” rather than routine. Most companies will need to grow their data center environment at some point, whether due to increased storage capacity requirements or a need for more servers and RAM. Unfortunately – and this is particularly true for storage – increasing capacity and performance is seen as an event that requires careful planning, potential downtime, and maybe even some risk.

For some, these characteristics may not be situations that require remedy, particularly the fact that virtualization is so popular. For many others, however, the last four characteristics on this list are detrimental to IT’s ability to meet the needs of the business, and to the business’s ability to meet current and future challenges as flexibly as necessary. As businesses become increasingly dependent on data center services, treating scale as a routine event will become more critical to the success of the business.

How Hyperconverged Infrastructure Can Help
Again, it’s important to realize that every organization is different. What works for Wayne Enterprises may be exactly the wrong solution for CHOAM right down the street. However, for a wide swath of the market, from the SMB to the small enterprise – and even beyond – hyperconverged infrastructure has the potential to solve many of these challenges.

Heavy Virtualization
Every hyperconverged option on the market starts with the assumption that an environment is virtualized. This is not a physical server play, by any means. Of course, that doesn’t mean that a company has to be 100 percent virtualized for hyperconvergence to be a viable option. It just means that the hyperconverged environment will rely on virtualization and the company will need to migrate already-virtualized workloads, convert physical workloads to virtual, or implement a hyperconverged environment alongside an existing system.

Hybrid Storage
A lot of companies continue to rely solely on traditional hard disks for storage. Although hard disks have been getting bigger year over year, their speed has pretty much flatlined, even as more demanding applications have been introduced. Flash storage carries the promise of fixing storage performance issues, but its cost per unit of capacity is still too high to be viable for many. Most hyperconverged infrastructure solutions therefore take a hybrid approach, pairing regular hard disks for big capacity with solid state disks for big performance. Behind the scenes, the software that powers the hyperconverged solution handles the heavy lifting of accelerating storage, using the solid state disks as a performance tier, as a large cache that sits in front of the spinning disk, or as both.
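The cache-in-front-of-spinning-disk idea described above can be sketched in a few lines. This is a minimal illustration, not any vendor's implementation: the class name, the FIFO eviction policy, and the dict-backed "tiers" are all assumptions chosen to keep the example small.

```python
class HybridStore:
    """Toy sketch of a hybrid read path: an SSD layer caches hot
    blocks in front of a slower spinning-disk capacity tier."""

    def __init__(self, ssd_capacity: int):
        self.ssd = {}                # fast tier / cache (limited size)
        self.hdd = {}                # capacity tier (holds everything)
        self.capacity = ssd_capacity

    def write(self, key, block):
        self.hdd[key] = block        # the capacity tier always holds the data

    def read(self, key):
        if key in self.ssd:          # cache hit: served at flash speed
            return self.ssd[key]
        block = self.hdd[key]        # cache miss: slow path to spinning disk
        if len(self.ssd) >= self.capacity:
            # evict the oldest cached block (simple FIFO policy)
            self.ssd.pop(next(iter(self.ssd)))
        self.ssd[key] = block        # promote the hot block to flash
        return block
```

Real solutions use far more sophisticated promotion and eviction heuristics, but the shape of the read path is the same: check flash first, fall back to disk, promote what's hot.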

Simplified Administration and Automation
Every resource in the data center has its own management interface. Different hyperconverged infrastructure vendors go to different lengths to reduce the number of consoles that an administrator needs to deal with throughout the day. Regardless, any reduction in touch points in the environment is generally going to simplify operations and make tasks more efficient, leading to reduced costs, faster time to value and more flexibility.

For many organizations, automating routine tasks is a critical goal: it saves money, reduces the potential for human error, and speeds time to value for new services. With hyperconverged, software-defined infrastructure, organizations have the opportunity to automate repeatable tasks. Shifting the data center paradigm from “bottom-up” element-level management to “top-down” workload-level management goes hand in hand with the software-defined movement.
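“Top-down” workload-level management often boils down to a declarative spec plus a reconciliation loop. The sketch below is purely illustrative, assuming a hypothetical desired-state format; no real hyperconverged product exposes exactly this API.

```python
# Hypothetical declarative spec: describe workloads, not individual elements.
DESIRED = {
    "web": {"vcpus": 2, "ram_gb": 4, "replicas": 3},
    "db":  {"vcpus": 8, "ram_gb": 32, "replicas": 1},
}

def reconcile(desired: dict, running: dict) -> list:
    """Compare desired workload state with replica counts currently
    running and return the actions needed to close the gap."""
    actions = []
    for name, spec in desired.items():
        have = running.get(name, 0)
        want = spec["replicas"]
        for i in range(have, want):     # scale up
            actions.append(
                f"create {name}-{i} ({spec['vcpus']} vCPU, {spec['ram_gb']} GB)"
            )
        for i in range(want, have):     # scale down
            actions.append(f"destroy {name}-{i}")
    return actions
```

Running the loop repeatedly makes the operation idempotent: once the environment matches the spec, `reconcile` returns an empty action list.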

The “Infrastructure Engineer”
Whereas traditional data centers often employ many people to manage individual resources, virtualization and hyperconverged storage may herald the rise of an engineer with a focus that spans the entire data center. This infrastructure engineer does not generally need to have deep technical understanding of every resource area, but needs to understand overall infrastructure outcomes. This person will need to know how to, for example, create virtual machines, but doesn’t necessarily need to know all of the nuts and bolts of the storage environment since much of the complexity is shielded from the user by a comprehensive software layer.

Scaling as a Routine Operation
Finally, hyperconverged infrastructure is built for scale. Need more storage? Need a bit more memory? Just add another building block. The existing environment will simply assimilate the new resources without disruption to the rest of the services. Better yet, this ability to “scale small” means that companies get to add resources in bite-sized chunks, reducing the amount of upfront capital spend that needs to take place.

The market often sees technology that promises to reshape IT, but it’s rare for that promise to become reality. With hyperconverged infrastructure, both the promise and the potential are there for many of today’s mainstream and emerging workloads.

Scott D. Lowe is an enterprise IT veteran with more than 20 years’ experience in senior and CIO roles and is co-founder and senior content editor and strategist at ActualTech Media.

kim_borg@wwpi.com (Kim Borg) frontpage Thu, 11 Dec 2014 18:31:07 +0000
MBaaS Bolstering the Mobile App Industry http://www.wwpi.com/index.php?option=com_content&view=article&id=17773:mbaas-bolstering-the-mobile-app-industry&catid=317:ctr-exclusives&Itemid=2701734 by Vicky Harris

What makes your smartphone so smart? Whether you’re an Apple fan or an Android user, it’s all about having more capabilities at your fingertips. And as the once-novel smartphone has quickly become a staple in our lives, a technology you may have never heard of has been behind the scenes making it all easier, facilitating the explosion in mobile capabilities: Mobile Backend as a Service (MBaaS). From app development to social media integration, MBaaS gives developers the tools to constantly push the limits of what today’s shiny plastic rectangles can do.

kim_borg@wwpi.com (Kim Borg) frontpage Mon, 08 Dec 2014 19:49:46 +0000
Software-Defined Storage Boosted through PCI Express http://www.wwpi.com/index.php?option=com_content&view=article&id=17763:software-defined-storage-boosted-through-pci-express&catid=334:feature-articles&Itemid=2701754 by Mani Subramaniyan

The term software-defined storage (SDS) implies software that manages the deployment and allocation of storage resources across a fabric, with on-demand, per-session or statically configured allocations. But such deployments, the available hardware and software infrastructure, and vendor offerings all differ widely in management features and performance characteristics, along with the other challenges storage-system developers face. Simplifying their task is a PCI Express (PCIe)-based fabric initiative now underway, providing an architecture built from the ground up to support SDS. Converged fabrics like this can be deployed over the universal PCIe interconnect.

kim_borg@wwpi.com (Kim Borg) frontpage Tue, 02 Dec 2014 10:46:14 +0000
Five Ways Two-factor Authentication Easily Adds Security to Login Procedures http://www.wwpi.com/index.php?option=com_content&view=article&id=17755:five-ways-two-factor-authentication-easily-adds-security-to-the-login-procedures&catid=322:ctr-exclusives&Itemid=2701741 by Dean Wiech

Organizations large and small can easily add security to their login procedures with two-factor authentication, which is a simple process that requires users to enter more than one piece of information to access accounts. For example, in addition to simply entering a user name and password, two-factor authentication requires use of another identifier, such as a smart card or a PIN code.

Major organizations, Twitter and Google among them, are making use of two-factor authentication. And while its primary goal is to improve the security of systems and applications, these solutions also provide additional features that can benefit all organizations, their employees included.
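The second factor used by services like the ones mentioned above is commonly a time-based one-time password (TOTP, RFC 6238): the server and the user's device share a secret and each derives a short code from the current 30-second window. A minimal sketch of the standard algorithm, using only the Python standard library:

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, at=None, digits: int = 6, step: int = 30) -> str:
    """RFC 6238 TOTP: HMAC-SHA1 over the current time-step counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(at if at is not None else time.time()) // step
    msg = struct.pack(">Q", counter)              # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()
    offset = digest[-1] & 0x0F                    # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Both sides compute the same code independently, so nothing secret crosses the wire at login time; an attacker with only the password still cannot produce a valid code.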

kim_borg@wwpi.com (Kim Borg) frontpage Wed, 26 Nov 2014 00:46:09 +0000
Hackers Are Winning at Hide-and-Go-Seek http://www.wwpi.com/index.php?option=com_content&view=article&id=17746:hackers-are-winning-at-hide-and-go-seek&catid=322:ctr-exclusives&Itemid=2701741 by Steve Lowing

Massive customer data breaches create urgency around managing endpoints and improving detection capabilities.

As information and security technology professionals, we have an insider’s view when news about massive data breaches makes the front page. But think about it from the customer perspective. How many times in the last year have you had to change your credit card number and the autopay accounts linked to it? With the rise of massive credit card data breaches at big box and online retailers, consumers are routinely exposed to the risks of identity theft, fraudulent credit card charges, and theft of bank account funds due to compromised PINs. Conscientious consumers may feel they are on constant alert, checking their accounts daily, changing passwords, and minding where they shop. Even those who aren’t this careful feel stressed and inconvenienced.

kim_borg@wwpi.com (Kim Borg) frontpage Thu, 20 Nov 2014 00:52:58 +0000
Five Tipping Points for Moving to a Next-Generation Manager of Managers (MoM) http://www.wwpi.com/index.php?option=com_content&view=article&id=17738:five-tipping-points-for-moving-to-a-next-generation-manager-of-managers-mom&catid=331:ctr-exclusives&Itemid=2701750 by Ken Fuhr

Two decades ago, on the cusp of the Internet bubble, a technology dubbed “Manager of Managers” (MoM) burst onto the scene. These systems promised a “single pane of glass” that would aggregate the streams of telemetry (events and alarms) emitted by myriad routers, servers, databases and applications across sprawling IT infrastructures, applying rules and filters to help determine the root cause of failures.

These legacy MoMs worked well enough back then – but that was long before virtualization, mobile, cloud, DevOps and so many other innovations that completely up-ended historical models of IT.

kim_borg@wwpi.com (Kim Borg) frontpage Tue, 18 Nov 2014 17:00:50 +0000
Building Beyond the Buzz: Infrastructure Virtualization in the Cloud http://www.wwpi.com/index.php?option=com_content&view=article&id=17730:building-beyond-the-buzz-infrastructure-virtualization-in-the-cloud&catid=317:ctr-exclusives&Itemid=2701734 by Adam Leventhal

Vendors attach descriptors like “virtualization”, “cloud” or “big data” to such a wide variety of products that the terms have been stretched too far to fit snugly on any comprehensible definition. Hype plays no small part. Who would want an iDisk when iCloud now stores your data in “the cloud”? The appeal of “software defined storage” is clear compared with storage defined by what? Godless hardware? More recently we’ve been asked to contemplate the “data lake” — a far more idyllic descriptor for undifferentiated data more typically thought of as massing in heaps and piles.

The datacenter is undergoing a major redistricting. Underlying the hype are major changes for consumers and the enterprise. The terms are confusing because we’re still struggling as an industry to describe the precise parameters of those changes. Indeed, while the general direction is clear, the best path is debated, and the destination is only vaguely known. The primal pieces of IT are not changing — persistent storage, computation and communication. The way those pieces are packaged, assembled and managed is changing dramatically. Cloud storage is still persistent storage, and virtual machines still execute programs. We’re changing how we define those abstractions, where the lines are drawn between the components, the interfaces to them, and the management of them.

kim_borg@wwpi.com (Kim Borg) frontpage Tue, 11 Nov 2014 16:29:46 +0000
Generating Standardized Reports on Unstructured Data from Multiple Sources http://www.wwpi.com/index.php?option=com_content&view=article&id=17709:generating-standardized-reports-on-unstructured-data-from-multiple-sources&catid=331:ctr-exclusives&Itemid=2701750 by Sergey Sinkevich

Empowered by present-day technology, the pervasive push into new analytical dimensions demands handling multi-source, multi-format data in a standardized and well-organized manner. End users want the ability to spot patterns and reveal tendencies in the avalanche of data right from their laptops or mobile devices. This holds particularly true for industries such as pharmaceuticals, healthcare or defense, where the objectives of decision analytics go far beyond accurate market analysis or a profound understanding of sales trends.

But what does it take to confront the diversity of data formats (we are talking about dozens, even hundreds, of types) and bring them to a unified presentation? Below are the three steps that should be taken, from effective data aggregation, through analysis and consolidation, to ready-to-use information.
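The aggregation-analysis-consolidation pipeline can be sketched in miniature. This is a hypothetical illustration with an assumed common record shape (`name`/`value`) and only two input formats; a real pipeline would handle many more formats and far richer schemas.

```python
import csv
import io
import json

def normalize(source: str, fmt: str) -> list:
    """Step 1 (aggregation): parse a source in its native format into a
    common record shape: {"name": str, "value": float}."""
    if fmt == "csv":
        rows = csv.DictReader(io.StringIO(source))
        return [{"name": r["name"], "value": float(r["value"])} for r in rows]
    if fmt == "json":
        return [{"name": r["name"], "value": float(r["value"])}
                for r in json.loads(source)]
    raise ValueError(f"unsupported format: {fmt}")

def consolidate(batches: list) -> dict:
    """Steps 2 and 3 (analysis and consolidation): merge all normalized
    batches into one ready-to-use summary keyed by name."""
    totals = {}
    for batch in batches:
        for rec in batch:
            totals[rec["name"]] = totals.get(rec["name"], 0.0) + rec["value"]
    return totals
```

The key design point is that every source is forced into one record shape at the boundary, so the analysis and reporting layers never need to know which format a record originally came from.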

kim_borg@wwpi.com (Kim Borg) frontpage Tue, 04 Nov 2014 16:55:21 +0000
The Importance of Data Analytics in the Fight against Advanced Persistent Threats and Cybercrime http://www.wwpi.com/index.php?option=com_content&view=article&id=17690:the-importance-of-data-analytics-in-the-fight-against-advanced-persistent-threats-and-cybercrime&catid=322:ctr-exclusives&Itemid=2701741 by Jeff Frazier

Home Depot’s recent announcement that a cyber-attack led to a data breach, compromising the credit card data of some 56 million customers, placed the home improvement retailer at the top of a list that no organization wants to be on, but which an ever-increasing number occupy: companies whose IT systems have been hacked, and whose customers and constituents have been victimized.

From organized crime targeting financial services organizations, to state-sponsored theft of trade secrets, to terrorists targeting critical infrastructure, it seems no company or institution is immune from advanced persistent threats (APTs) – targeted cyber-attacks by unauthorized persons or entities on specific targets and conducted over long periods of time to avoid detection.

A recent report from software maker McAfee and the Center for Strategic and International Studies estimated that APTs and cybercrime cost the world economy between $400 billion and $575 billion. The report’s authors are blunt: “Cybercrime is a growth industry.”

kim_borg@wwpi.com (Kim Borg) frontpage Thu, 23 Oct 2014 17:21:49 +0000