
Microservices: A game changer for application migration to the cloud

10 Feb 17

Reports about “microservices” are becoming increasingly common, covering what they are and how enterprises are using them to run applications in the cloud or in on-premises data centers.

While microservices resemble the rapid development and deployment approaches used by DevOps teams, their adoption can also be seen as a catalyst for service-oriented architecture (SOA) and cloud migration.

What are microservices?

Analysts, consultants and suppliers of applications and development tools have been discussing the concept of a common architectural style for years, but it wasn’t until May 2012 that a workshop of software architects coined the name “microservices.”

The best way to think of them is as an application architecture based upon the use of small, independent services to create larger and more complex applications and products.

Complex applications are being broken down into small pieces, or components, each of which may be developed, documented and tested in parallel by multiple distributed development teams.

This approach helps both business and development teams deliver more feature-rich products in less time. Each application component’s features are delivered as microservices, tailored to work independently while interoperating with other components.

This enables teams to execute, test, deploy and troubleshoot quickly.
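To make the idea concrete, here is a minimal sketch of what one such independent component might look like: a hypothetical “inventory” service written in Go that exposes a single HTTP endpoint. The service name, the route and the port are illustrative assumptions, not details from any product described in this article.

```go
// Minimal sketch of a single microservice: one small responsibility,
// exposed over HTTP, independently testable and deployable.
// The "inventory" domain, route and port are illustrative assumptions.
package main

import (
	"encoding/json"
	"log"
	"net/http"
)

// Item is a toy payload for this example service.
type Item struct {
	SKU   string `json:"sku"`
	Stock int    `json:"stock"`
}

func main() {
	// A single, narrow endpoint: report stock for one SKU.
	http.HandleFunc("/inventory/widget-42", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(Item{SKU: "widget-42", Stock: 17})
	})

	// The service runs in its own process; it can be fixed, redeployed
	// or scaled without touching the rest of the application.
	log.Fatal(http.ListenAndServe(":8081", nil))
}
```

Because the service owns one narrow responsibility, a team can test and redeploy it without coordinating a release of the whole application.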

Once all the functions required to build a specific application are available as microservices, the application can be assembled and deployed from those functions, quickly tested, documented and moved into live production.

These individual services communicate with one another over high-speed interconnections and can be deployed on-premises, off-premises or in a geographically distributed hybrid environment.
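As a sketch of that communication, the snippet below shows how another hypothetical service might call the inventory endpoint above over HTTP. The INVENTORY_URL environment variable, the local default address and the two-second timeout are assumptions made for illustration.

```go
// Minimal sketch of one microservice consuming another over HTTP.
// INVENTORY_URL and the default address are illustrative assumptions.
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"os"
	"time"
)

// Item mirrors the payload served by the example inventory service.
type Item struct {
	SKU   string `json:"sku"`
	Stock int    `json:"stock"`
}

func main() {
	base := os.Getenv("INVENTORY_URL")
	if base == "" {
		base = "http://localhost:8081" // local default for the example
	}

	// A short timeout keeps a slow dependency from stalling this service.
	client := &http.Client{Timeout: 2 * time.Second}

	resp, err := client.Get(base + "/inventory/widget-42")
	if err != nil {
		fmt.Println("inventory service unavailable:", err)
		return
	}
	defer resp.Body.Close()

	var item Item
	if err := json.NewDecoder(resp.Body).Decode(&item); err != nil {
		fmt.Println("unexpected response:", err)
		return
	}
	fmt.Printf("%s in stock: %d\n", item.SKU, item.Stock)
}
```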

Issues in a specific application function can be monitored, replicated and fixed, and updates can be rolled into production with a short turnaround time.

Microservices complement the testing process because the application as a whole can still function while individual components or services are being fixed and deployed.

By using microservices, organizations can bring up new bare-bones applications quickly and flesh them out as new functions are developed. In large-scale systems, where new feature rollouts can happen daily, microservices are a blessing that accelerates the delivery process.

How microservices are changing the game of football

Microservices played a huge role in this year’s Super Bowl by speeding player data to NFL officials, coaches, fans and broadcast networks. Zebra Technologies’ sports data tracking system calculates NFL players’ speed, distance, closing distance, routes and formations, and then quickly adds “eventing data,” which translates all the information into a football context.

Microservices enable the capture, processing and delivery of this data with incredible speed. They capture the information about a play as it is unfolding and send it to broadcasters immediately for use in TV replays.

The Zebra system relies on RFID hardware placed around the stadium and in every player’s shoulder pads as part of the official NFL uniform: 22 receivers in the stadium, two RFID tags on every player, two on each referee and one in the ball.

The RFID tags track movements up to 25 times a second and deliver the information, including latitudinal and longitudinal data, in about half a second. Microservices and high-speed interconnections between the Zebra systems, IoT devices and the recipients of the data can make a Super Bowl replay more interesting to watch than the live broadcast.
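To illustrate the kind of payload such a pipeline might pass between services, here is a hypothetical sketch in Go of a single tracking sample. The field names, units, sample values and JSON shape are assumptions for illustration only, not Zebra’s actual data format.

```go
// Hypothetical sketch of a single player-tracking sample that one
// microservice might enrich and forward to the next. Field names,
// units and JSON shape are assumptions, not Zebra's actual format.
package main

import (
	"encoding/json"
	"fmt"
	"time"
)

// TrackingEvent carries one position sample for one tagged player.
type TrackingEvent struct {
	PlayerID  string    `json:"player_id"`
	Latitude  float64   `json:"latitude"`
	Longitude float64   `json:"longitude"`
	SpeedMPS  float64   `json:"speed_mps"`
	Timestamp time.Time `json:"timestamp"`
}

func main() {
	// At roughly 25 samples a second per tag, each event stays small so
	// downstream services can add context ("eventing data") and forward
	// it within the half-second window described above.
	ev := TrackingEvent{
		PlayerID:  "player-17",
		Latitude:  36.1234,  // illustrative values only
		Longitude: -86.5678, // illustrative values only
		SpeedMPS:  8.3,
		Timestamp: time.Now().UTC(),
	}
	out, _ := json.Marshal(ev)
	fmt.Println(string(out))
}
```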

Where VMs and containers fit in

We have seen the use of virtual computing environments grow since 2014, first with virtual machines (VMs) and now with containers. We also believe the trend toward developing and deploying applications based on microservices is closely related to the move toward encapsulated virtual environments.

VMs make it possible for operating systems, combined with application dependencies, data management tools and application components, to be provisioned and deployed quickly. This approach works well even when the applications and components have each been written for different operating systems, since they can all be hosted on a single physical server without incompatibilities between or among them.

Containers are similar to VMs, but they assume that each function is designed to execute on a single operating system, whether on one host or across multiple hosts or clouds.

Containers are flexible enough to provide multiple independent partitions, each holding an application or one of its components, deployed under a single host operating system.

This reduces the system memory and processing power required to support those independent components, making infrastructure use more efficient and easier to justify. Switching from component to component as the aggregate application executes can also be done much more quickly, which enables seamless scalability of the applications.

Like microservices, containers are lightweight. And when microservices are packaged into containers, migration across data centers and clouds becomes a cakewalk for DevOps and infrastructure teams.

On-premises, cloud and hybrid deployments

The evolution of containers enables enterprise applications, and distributed applications in general (e.g., IoT in the NFL), to be constructed from microservices, and these packaged microservices gain a platform independence that separates deployment from legacy dependencies (e.g., OS, libraries, connection string properties).
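As a small sketch of that separation, the snippet below reads a connection string from the environment instead of baking it into the build, so the same packaged service could run on-premises or in any cloud with only its configuration changing. The DATABASE_URL variable name is an assumption for illustration.

```go
// Minimal sketch of externalised configuration: the deployment
// environment, not the code, supplies the connection string.
// The DATABASE_URL variable name is an illustrative assumption.
package main

import (
	"fmt"
	"os"
)

func main() {
	dsn := os.Getenv("DATABASE_URL")
	if dsn == "" {
		// Fail fast rather than guess which environment we are in.
		fmt.Fprintln(os.Stderr, "DATABASE_URL is not set")
		os.Exit(1)
	}
	fmt.Println("connecting using externally supplied configuration")
	_ = dsn // a real service would open its data store connection here
}
```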

Enterprises can choose to deploy all or selected microservices locally or in any data center, cloud service or hybrid environment. Equinix has been on a journey with microservices and containers, and we see them increasing efficiency, ease of deployment, manageability and on-demand scale in our own and our customers’ application deployments.

Article by Ramchandra Koty and Balasubramaniyan Kannan, Equinix blog network
