Data centers are the lifelines of all things internet. The movie you streamed last night and the photos you shared online work the way they do thanks to these power-hungry facilities, which squat on the outskirts of cities and occupy large swathes of real estate. So what is the alternative to the land and power problems of data centers? Deploy them underwater.
Microsoft has been experimenting with this idea for over five years through Project Natick, and the results are finally out: the team recently proved that the underwater data center concept is both feasible and practical.
Microsoft first proposed the underwater data center concept back in 2014. The objective was to deliver lightning-quick cloud services to coastal populations while saving energy.
The Project Natick team then deployed prepackaged data center units to operate lights-out on the seafloor for years at a time. The project’s Northern Isles data center has been humming away 117 feet under the sea off Scotland since June 2018. Phase 1 of the project ran into issues such as biofouling; with Phase 2, however, the team made some radical changes. According to Microsoft, nitrogen did the trick: a nitrogen atmosphere is less corrosive than oxygen, and with fewer people around the components, reliability improved dramatically over the previous run. Added to this, the consistently cool subsurface seas allow for energy-efficient data center designs. The data center was deployed at the European Marine Energy Centre, a test site for tidal turbines and wave energy converters. Retrieving it required calm seas and a choreographed dance of robots and winches. Once hauled up, the container-sized data center was taken back to the garage for health checks on the server racks and other components. With the results in their favour, the researchers are now contemplating how to translate this success to a larger scale, including on land. Team Natick believes this hardware will help them understand why the underwater servers proved eight times more reliable than those on land.
An extensively researched list of top Microsoft big data analytics and solution providers, with ratings and reviews to help you find the best Microsoft big data solutions development companies around the world.
An exclusive list of Microsoft Big Data consulting and solution providers, compiled after examining various attributes of expert big data analytics firms and identifying the matches with proven excellence in data analytics. Drawing insights from the whole of an organization’s data has become necessary for business growth and enterprise acceleration, so we bring you the most trustworthy Microsoft Big Data consultants and solution providers for your assistance.
Let’s take a look at the list of the best Microsoft big data solutions companies.
#microsoft big data solutions development companies #microsoft big data analytics and solution #microsoft big data consultants #microsoft big data developers #microsoft big data #microsoft big data solution providers
If you accumulate data on which you base your decision-making as an organization, you most probably need to think about your data architecture and consider possible best practices. Gaining a competitive edge, remaining customer-centric to the greatest extent possible, and streamlining processes to get on-the-button outcomes can all be traced back to an organization’s capacity to build a future-ready data architecture.
In what follows, we offer a short overview of the overarching capabilities of data architecture. These include user-centricity, elasticity, robustness, and the capacity to ensure the seamless flow of data at all times. Added to these are automation enablement, plus security and data governance considerations. These points form our checklist for what we perceive to be an anticipatory analytics ecosystem.
#big data #data science #big data analytics #data analysis #data architecture #data transformation #data platform #data strategy #cloud data platform #data acquisition
Data is everywhere. By 2025, the amount of data generated is predicted to cross a staggering 175 zettabytes. The data explosion has amplified the importance of data centres. Technology giants like IBM and Microsoft are aggressively investing in setting up data centres globally. India alone saw 14 major data centre investments in 2020 (till September).
Data centres, often referred to as the backbone of the internet, are used to store, communicate, and transport the information generated on a daily basis.
Given the kind of heavy lifting they do in the modern computing era, data centres need to be highly efficient, robust, and up-to-date. Besides, a good modern data centre has to be cost-effective, energy-efficient, and responsive. So the looming question is: are today’s data centres efficient enough to keep up with the accelerating demand?
#opinions #building modern data centre #data centre modularisation #hybrid data centre #module data centre #software-defined data centre
The Internet Freedom Foundation (IFF) recently published its recommendations for the draft on Data Centre Policy by the Ministry of Electronics and Information Technology (MeitY).
The draft, published by MeitY on November 5 this year, laid out several policy strategies for the growth of the data centre sector in India and solicited recommendations from domain experts, public policy professionals, and students within 15 days.
In response to MeitY’s request, the recommendations published by the IFF raise valid concerns about the lack of data governance and security standards in the draft.
#opinions #data centre india #data governance act #data privacy concerns #data protection bill #data safety #data security #meity draft on data centre policy
The opportunities big data offers also come with very real challenges that many organizations are facing today. Often, it’s finding the most cost-effective, scalable way to store and process boundless volumes of data in multiple formats that come from a growing number of sources. Then organizations need the analytical capabilities and flexibility to turn this data into insights that can meet their specific business objectives.
This Refcard dives into how a data lake helps tackle these challenges at both ends — from its enhanced architecture that’s designed for efficient data ingestion, storage, and management to its advanced analytics functionality and performance flexibility. You’ll also explore key benefits and common use cases.
As technology continues to evolve with new data sources, such as IoT sensors and social media churning out large volumes of data, there has never been a better time to discuss the possibilities and challenges of managing such data for varying analytical insights. In this Refcard, we dig deep into how data lakes solve the problem of storing and processing enormous amounts of data. While doing so, we also explore the benefits of data lakes, their use cases, and how they differ from data warehouses (DWHs).
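The core idea above — landing raw data in multiple formats and applying structure only at read time (schema-on-read) — can be sketched in a few lines. This is a minimal illustration, not a production data lake: the directory layout, file names, and `read_partition` helper are all hypothetical, and a real lake would typically sit on object storage such as S3 with formats like Parquet.

```python
import csv
import json
import tempfile
from pathlib import Path

# Hypothetical lake root; real lakes usually live on object storage (e.g., S3).
lake_root = Path(tempfile.mkdtemp()) / "lake" / "raw" / "events"

# Ingest: land records as-is, partitioned by date, without forcing one schema.
day = lake_root / "date=2021-01-15"
day.mkdir(parents=True)

(day / "clicks.json").write_text(json.dumps(
    [{"user": "u1", "page": "/home"}, {"user": "u2", "page": "/pricing"}]))

with (day / "sensors.csv").open("w", newline="") as f:
    writer = csv.writer(f)
    writer.writerows([["device", "temp_c"], ["d1", "21.5"], ["d2", "19.0"]])

# Analyze: structure is applied only when the data is read (schema-on-read).
def read_partition(path):
    records = []
    for file in sorted(path.iterdir()):
        if file.suffix == ".json":
            records.extend(json.loads(file.read_text()))
        elif file.suffix == ".csv":
            with file.open(newline="") as f:
                records.extend(csv.DictReader(f))
    return records

rows = read_partition(day)
print(len(rows))  # records gathered across two different formats
```

This contrasts with a data warehouse, where both sources would have to be transformed into one predefined schema before loading; here the raw files stay untouched and each consumer decides how to interpret them.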
This is a preview of the Getting Started With Data Lakes Refcard. To read the entire Refcard, please download the PDF from the link above.
#big data #data analytics #data analysis #business analytics #data warehouse #data storage #data lake #data lake architecture #data lake governance #data lake management