Supermicro SuperServer E200-8D/E300-8D review
Posted by Alex Khorolets on April 21, 2017

These days, more and more companies need high-quality, reliable, and efficient server hardware. Home labs, used by enthusiasts and IT professionals for software development and testing, studying for IT certifications, and building virtual environments, have become popular as well. Small companies whose production runs on a couple of virtual machines or networking applications are also interested in inexpensive, compact servers.

Supermicro has held one of the leading positions in server development for a long time, with products ranging from high-end clusters to microservers. Recently, the company released two compact servers: the SuperServer E200-8D and its younger model, the SuperServer E300-8D.

Supermicro SuperServers


Data Management Moves to the Fore. Part 2: Data Management Has Many Moving Parts
Posted by Jon Toigo on April 4, 2017

In the previous blog, we established that there is a growing need to focus on Capacity Utilization Efficiency in order to “bend the cost curve” in storage. Just balancing data placement across repositories (Capacity Allocation Efficiency) is insufficient to cope with the impact of data growth and generally poor management. Only by placing data on infrastructure in a deliberate manner that optimizes data access, storage services, and costs can IT pros possibly cope with the coming data deluge anticipated by industry analysts.

The problem with data management is that it hasn’t been advocated or encouraged by vendors in the storage industry.  Mismanaged data, simply put, drives the need for more capacity – and sells more kit.
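
To make the idea of deliberate data placement a little more concrete, here is a small sketch (not from the original post) of a hypothetical tiering routine that assigns data sets to storage tiers by access frequency, so that rarely touched data stops consuming expensive primary capacity. The tier names, thresholds, and per-GB costs are invented for illustration.

```python
# Hypothetical sketch: place data sets on storage tiers by access frequency.
# Tier names, thresholds, and costs are illustrative assumptions, not from the post.

TIERS = [
    # (name, minimum accesses per month, cost per GB per month)
    ("primary-flash", 100, 0.25),
    ("nearline-disk", 10, 0.08),
    ("archive", 0, 0.01),
]

def place(dataset_size_gb, accesses_per_month):
    """Return the fastest tier the data set's access rate justifies;
    colder data falls through to cheaper tiers."""
    for name, min_access, cost_per_gb in TIERS:
        if accesses_per_month >= min_access:
            return name, dataset_size_gb * cost_per_gb
    name, _, cost_per_gb = TIERS[-1]
    return name, dataset_size_gb * cost_per_gb

if __name__ == "__main__":
    for size_gb, hits in [(500, 450), (2000, 12), (8000, 0)]:
        tier, monthly_cost = place(size_gb, hits)
        print(f"{size_gb} GB, {hits} accesses/month -> {tier}, ~${monthly_cost:.2f}/month")
```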

COMPONENTS OF A COGNITIVE DATA MANAGEMENT SOLUTION


Data Management Moves to the Fore. Part 1: Sorting Out the Storage Junk Drawer
Posted by Jon Toigo on March 28, 2017

Most presentations one hears at industry trade shows and conferences have to do, fundamentally, with Capacity Allocation Efficiency (CAE).  CAE seeks to answer a straightforward question:  Given a storage capacity of x petabytes or y exabytes, how will we divvy up space to workload data in a way that reduces the likelihood of a catastrophic “disk full” error?

Essentially, from a CAE perspective, efficiency involves balancing the volume of bits across physical storage repositories in a way that does not leave one container nearly full while another has mostly unused space. The reason is simple. As the volume of data grows and the capacity of media (whether disk or flash) increases, a lot of data, belonging to many users, can find its way into a single repository. When that happens, access to the data can be impaired (a lot of access requests across a few bus connections can introduce latency). This, in turn, shows up as slower application performance, whether the workload is a database or a virtual machine.
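
For illustration only (this is not part of the original article), a toy allocation routine in the spirit of CAE might simply steer new data to whichever repository would remain least utilized after the write, so no single container fills up while the others sit mostly empty. The repository names and capacities below are made up.

```python
# Toy Capacity Allocation Efficiency sketch: pick the repository that stays
# least utilized after absorbing the new data. All figures are invented.

repositories = {
    # name: (used_tb, capacity_tb)
    "array-a": (78.0, 100.0),
    "array-b": (22.0, 100.0),
    "array-c": (145.0, 200.0),
}

def pick_target(repos, new_data_tb):
    """Choose the repository with the lowest utilization after the new data lands."""
    candidates = {
        name: (used + new_data_tb) / capacity
        for name, (used, capacity) in repos.items()
        if used + new_data_tb <= capacity
    }
    if not candidates:
        raise RuntimeError("No repository can hold the new data -- time to buy capacity")
    return min(candidates, key=candidates.get)

if __name__ == "__main__":
    print("Place the new 15 TB volume on", pick_target(repositories, new_data_tb=15.0))
```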

Survey of 2000 company disk storage environments


Design a ROBO infrastructure (Part 2): Design areas and technologies
Posted by Andrea Mauro on February 24, 2017

In the previous post, we explained and described the business requirements and constraints that support design and implementation decisions suited for mission-critical applications, and also considered how risk can affect those decisions.

Now we will look at how the following technology areas can satisfy the design requirements:

  • Availability
  • Manageability
  • Performance and scaling
  • Recoverability
  • Security
  • Risk and budget management

ROBO Design areas and technologies


RAM Disk technology: Performance Comparison
Posted by Alex Khorolets on February 23, 2017

Introduction

Every computer now has a certain amount of volatile storage available in RAM. With other direct-access media used for data storage, for example, hard disks, CD-RWs, DVD-RWs, and the older drum memory, the time needed to read or write data depends on the physical location of the data and on the mechanics of the medium (rotation speed and arm movement).

Using RAM as storage provides a number of benefits over these conventional devices, because data is read or written in the same amount of time regardless of its physical location within the volume. With all of the above taken into account, it would be a crime not to take advantage of these conditions.
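
As a rough, unscientific illustration of that point (this sketch is not part of the original article), the script below times random 4 KiB reads against a buffer held in RAM and against a temporary file on disk. Keep in mind that the operating system's page cache can blur the difference on a freshly written file, so treat the numbers as indicative only.

```python
# Rough comparison of random-offset reads from RAM versus a file on disk.
# Sizes, block size, and read count are arbitrary choices for the sketch.

import os
import random
import tempfile
import time

SIZE = 64 * 1024 * 1024   # 64 MiB test region
BLOCK = 4096              # 4 KiB reads
READS = 5000

ram_buffer = bytearray(os.urandom(SIZE))

# Write the same data to a temporary file so both tests read identical content.
with tempfile.NamedTemporaryFile(delete=False) as f:
    f.write(ram_buffer)
    disk_path = f.name

def time_ram_reads():
    start = time.perf_counter()
    for _ in range(READS):
        offset = random.randrange(0, SIZE - BLOCK)
        _ = ram_buffer[offset:offset + BLOCK]
    return time.perf_counter() - start

def time_disk_reads():
    start = time.perf_counter()
    with open(disk_path, "rb") as f:
        for _ in range(READS):
            f.seek(random.randrange(0, SIZE - BLOCK))
            _ = f.read(BLOCK)
    return time.perf_counter() - start

print(f"RAM : {time_ram_reads():.4f} s for {READS} random {BLOCK}-byte reads")
print(f"Disk: {time_disk_reads():.4f} s for {READS} random {BLOCK}-byte reads")
os.unlink(disk_path)
```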

RAM


Storage HA on the Cheap: Fixing Synology DiskStation flaky Performance with StarWind Free. Part 3 (Failover Duration)
Posted by Vladislav Karaiev on February 17, 2017

We are continuing our series of articles dedicated to Synology’s DS916+ mid-range NAS units. Remember, we don’t dispute that Synology delivers a great set of NAS features. Instead, we are running a number of tests on a pair of DS916+ units to determine whether they can be used as general-purpose primary production storage. In Part 1, we tested the performance of the DS916+ in different configurations and showed how to significantly increase the performance of a “dual” DS916+ setup by replacing the native Synology DSM HA Cluster with StarWind Virtual SAN Free.

Synology DS916 and StarWind


AWS wants your Databases in the Cloud: Amazon Aurora offering up 5X Better Performance and PostgreSQL Compatibility
Posted by Augusto Alvarez on February 16, 2017

Amazon recently released the Aurora storage engine as a MySQL-compatible relational database service and is strongly encouraging customers to migrate from Oracle or Microsoft SQL Server to this new cloud platform. Amazon Aurora promises up to five times the performance of MySQL, with the security, availability, and reliability of a commercial database at roughly 10% of the cost organizations are paying today. A short time ago, Amazon also announced PostgreSQL compatibility (available as a preview).
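
Because Aurora speaks the MySQL wire protocol, an existing MySQL client library should connect to it unchanged. The snippet below is only a sketch of that idea; the cluster endpoint, credentials, and database name are placeholders, not values from the article.

```python
# Minimal connectivity sketch using the PyMySQL client (pip install pymysql).
# Endpoint, user, password, and database below are placeholders.

import pymysql

connection = pymysql.connect(
    host="my-cluster.cluster-xxxxxxxxxxxx.us-east-1.rds.amazonaws.com",  # Aurora cluster endpoint (placeholder)
    user="admin",
    password="example-password",
    database="appdb",
    connect_timeout=10,
)

try:
    with connection.cursor() as cursor:
        cursor.execute("SELECT VERSION()")
        print("Connected to:", cursor.fetchone()[0])
finally:
    connection.close()
```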

Amazon Aurora


Setting yourself up for success with virtualization
Posted by Michael Ryom on February 16, 2017

I am going to address a few issues I have seen quite a lot in my virtualization career. It is not that you have to take extra care when virtualizing, but your virtual environment will never be better than the foundation you build it on. The reason you do not see many people fuss about this in non-virtualized environments (anymore), I believe, is that resources are in abundance today. They were abundant ten years ago as well, but since then server hardware specifications have only climbed, which is what got us virtualizing in the first place. Do not get me wrong: lots of people care about the performance of their virtual and physical environments. Yet some have not set themselves up for a successful virtualization project. Let me elaborate…

Mind the gap of virtualization


Azure Storage New Features: Larger Blobs and New Storage Emulator
Posted by Augusto Alvarez on January 23, 2017

Microsoft recently announced new features and updated capabilities within Azure Storage. These include larger blobs, incremental copy, new API capabilities, and an updated version of the Storage Emulator.
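
For anyone who wants to poke at the emulator from code, here is a minimal sketch using the azure-storage-blob Python package (a newer SDK than the one current when this post was written); the connection string is the emulator's well-known development account, and the container and blob names are arbitrary.

```python
# Upload a small blob to the local Azure Storage Emulator / Azurite.
# Uses the well-known development-storage account; container/blob names are arbitrary.

from azure.core.exceptions import ResourceExistsError
from azure.storage.blob import BlobServiceClient  # pip install azure-storage-blob

EMULATOR_CONN_STR = (
    "DefaultEndpointsProtocol=http;"
    "AccountName=devstoreaccount1;"
    "AccountKey=Eby8vdM02xNOcqFlqUwJPLlmEtlCDXJ1OUzFT50uSRZ6IFsuFq2UVErCz4I6tq/K1SZFPTOtr/KBHBeksoGMGw==;"
    "BlobEndpoint=http://127.0.0.1:10000/devstoreaccount1;"
)

service = BlobServiceClient.from_connection_string(EMULATOR_CONN_STR)

try:
    container = service.create_container("demo-container")
except ResourceExistsError:
    container = service.get_container_client("demo-container")

blob = container.get_blob_client("hello.txt")
blob.upload_blob(b"Hello from the Storage Emulator", overwrite=True)
print("Uploaded", blob.blob_name)
```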

MS Azure


Capacity planning with vROps
Posted by Michael Ryom on January 18, 2017

Capacity planning is one of the tasks that every IT organization needs to do, but most do it very poorly. This is not out of bad will or a lack of skills. Most often, it is because they lack a good way of dealing with all the changes, past, present, and future. Capacity planning is also usually done reactively: statistics are pulled from vCenter into Word or Excel, where graphs of past data points form a historical trend. This trend is then used to predict future growth, and cluster sizing and purchasing decisions are based on it. Alternatively, the all too familiar “we are out of resources, hurry, we need to buy more” scenario comes into play. None of these capacity-planning techniques are very good. There is most probably a need to do things smarter.
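
To make that reactive approach concrete, here is a small sketch (the usage figures and the 100 TB ceiling are invented) that does exactly what the Word/Excel exercise does: fit a straight-line trend to historical datastore usage and project when the cluster runs out of space.

```python
# Fit a linear trend to 12 months of (invented) datastore usage and estimate
# how many months remain before the cluster is full.

import numpy as np

months = np.arange(12)  # last 12 months
used_tb = np.array([41, 43, 44, 47, 49, 52, 54, 57, 60, 62, 65, 68], dtype=float)
capacity_tb = 100.0

slope, intercept = np.polyfit(months, used_tb, 1)   # growth in TB per month
months_until_full = (capacity_tb - used_tb[-1]) / slope

print(f"Growth rate: {slope:.1f} TB/month")
print(f"At this rate the cluster is full in roughly {months_until_full:.0f} months")
```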

vROPS Health Risk Efficiency
