'Edge' of the centre
Our networks and data centres are coming under pressure as emerging technologies push them to the limit. The demand for data and information has grown, and continues to grow, to the point where today’s systems are struggling to cope.
One of the common themes raised in discussions around the increasing burden placed upon IT infrastructure is the ‘Edge’. Unfortunately, there is no simple definition of the ‘Edge’, as it can differ by organisation and/or by market.
‘Edge’ Data Centres and networks in the Telco market represent infrastructure deployed at a regional level rather than centralised, enabling the delivery of live streaming media, on-demand streaming media and social networks. The key benefit to the user is better availability and performance; after all, nobody likes to watch the buffering percentage climb at a snail’s pace!
The Internet of Things (IoT) has its own take on the ‘Edge’. The IoT is a network of physical objects, be they devices, vehicles, buildings or other items, that are provisioned with electronics, software, sensors and network connectivity, enabling the collection and exchange of data. The ‘Edge’ in this scenario can refer to the sensors that collect the data; once processed, that data becomes information that gives a person or organisation the knowledge to make more informed decisions. The huge volumes of data generated by the IoT require real-time processing, which can cause latency and flexibility issues when data is stored centrally. To combat this, we are seeing some applications being decentralised to ‘Edge’ Data Centres and ‘Micro’ Data Centres. A good example is a high-profile Australian mine that has multiple data centres at different locations which collect and process data relating to the mine operation itself, as well as controlling autonomous trains and trucks. This data is then analysed and acted upon many hundreds of kilometres away at a network operating centre.
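As a purely illustrative sketch of the principle, the snippet below shows how processing at the ‘Edge’ can reduce what travels back to a central network operating centre: raw sensor readings are summarised locally and only a compact result is forwarded. The sensor names, threshold and summary fields here are assumptions for illustration, not details of the mine example above.

```python
# Hypothetical sketch: edge-side aggregation of sensor readings so that only
# a small summary payload, rather than every raw reading, is sent to the
# central network operating centre. All names and values are illustrative.
from dataclasses import dataclass
from statistics import mean
from typing import List


@dataclass
class SensorReading:
    sensor_id: str
    value: float  # e.g. a bearing temperature in degrees Celsius


def summarise_at_edge(readings: List[SensorReading], alert_threshold: float) -> dict:
    """Reduce a batch of raw readings to a compact summary for the NOC."""
    values = [r.value for r in readings]
    return {
        "count": len(values),
        "mean": round(mean(values), 2),
        "max": max(values),
        # Flag raised locally so the edge site can act without waiting on the NOC
        "alert": max(values) > alert_threshold,
    }


if __name__ == "__main__":
    batch = [SensorReading("truck-07-bearing", v) for v in (81.2, 83.5, 96.1)]
    # Only this summary would cross the wide-area link, not the raw stream.
    print(summarise_at_edge(batch, alert_threshold=90.0))
```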
Within the industrial arena, the ‘Edge’ often refers to the location of IT equipment that has superseded industrial controllers and proprietary protocols. The term ‘Industry 4.0’ refers to the fourth industrial revolution, a new phase in the organisation and management of the entire value chain over the full product life cycle. This cycle is increasingly oriented towards individual customer requirements and extends from ideas, orders, development and manufacturing to end-customer product delivery and recycling, including all associated services. The framework is provided by networking all the elements involved in the value chain so that all relevant information is available in real time, and by using the data to derive the optimum value-adding stream at all times. Linking people, objects and systems produces dynamic, self-organising and cross-company value-adding networks that can be updated in real time and optimised on the basis of different criteria, including cost, availability and consumption of resources. Increasingly we are finding the IT equipment that drives Industry 4.0 moving out of the server room and onto factory floors and into other harsh environments that challenge its performance and operational characteristics.
For the majority, the ‘Edge’ refers to any decentralised processing environment outside of the primary data centre. This may be a remote office, warehouse or factory. It may also refer to a hybrid cloud environment where some applications are forced to reside ‘on premises’ due to an organisation’s rules around data sovereignty, or to ensure performance is maintained. As with other ‘Edge’ locations, one of the key challenges lies in providing a holistic physical infrastructure that is secure, reliable and scalable.
The significant take-up of cloud and co-location services in Australia has given many organisations a feel for what can be realised when you have millions of dollars to spend on IT infrastructure. There are now many Tier III cloud facilities built and operating around the country and, in the same sense as flying at the pointy end of the plane, once you’ve experienced how it should be done it’s very difficult to go back. Fortunately, the challenge of providing the correct physical infrastructure for IT equipment at the ‘Edge’ is becoming easier.
The concept of the ‘Micro’ Data Centre is essentially taking the critical infrastructure you’d find in a large data centre and shrinking it down into one or two racks. Solutions typically deliver between 3 and 12 kW of power and cooling. Power typically includes a single-phase UPS, bypass switch and intelligent power distribution rails, with switching options that may be important for remote sites. Cooling often uses an in-rack DX (direct expansion) split system that essentially provides its own environment, having little impact on the location in which it is placed and vice versa. Racks typically carry an IP55 protection rating to limit the ingress of water and dust, while also limiting the potential for condensation and enabling the use of gaseous fire suppression systems. As the racks often have a hybrid configuration, 800 mm wide racks provide ample room for cabling, whilst depths of up to 1200 mm accommodate the deepest of servers. Heights vary from 24 to 47 U, with the focus placed on usable rack space once the power and cooling elements are taken into account. Security and resilience come from enclosures that offer electro-magnetic locking systems governing access to the racks via keypads or proximity cards, and automatic door-opening kits that act as a failsafe should the cooling system fail catastrophically. Monitoring of the infrastructure’s performance and security is critical, especially for fire detection, where options exist for early smoke detection coupled with gas flooding suppression to negate any potential threats.
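To make the monitoring point concrete, the sketch below shows, in very simplified form, the kind of rule-based checks a ‘Micro’ Data Centre monitor might run across temperature, access and smoke inputs. The sensor fields, threshold and alert messages are assumptions made for illustration, not the behaviour of any particular product.

```python
# Illustrative sketch only: a simple evaluation step for a 'Micro' Data Centre
# environment monitor. The sample fields, limit and messages are assumptions;
# a real monitor would poll hardware sensors and raise alarms via its own API.
from dataclasses import dataclass
from typing import List


@dataclass
class EnvironmentSample:
    inlet_temp_c: float   # rack inlet temperature
    door_open: bool       # state of the electro-magnetic lock / door sensor
    smoke_detected: bool  # early smoke detection input


def evaluate(sample: EnvironmentSample, temp_limit_c: float = 27.0) -> List[str]:
    """Return alert messages for any condition that needs attention."""
    alerts = []
    if sample.smoke_detected:
        alerts.append("SMOKE: trigger gas flooding suppression and notify operations")
    if sample.inlet_temp_c > temp_limit_c:
        alerts.append(
            f"TEMP: inlet {sample.inlet_temp_c} degC exceeds {temp_limit_c} degC; "
            "check DX cooling and door-opening failsafe"
        )
    if sample.door_open:
        alerts.append("ACCESS: rack door open; verify against access control log")
    return alerts


if __name__ == "__main__":
    # One simulated sample; a real deployment would evaluate these on a schedule.
    sample = EnvironmentSample(inlet_temp_c=29.5, door_open=False, smoke_detected=False)
    for alert in evaluate(sample):
        print(alert)
```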
‘Micro’ Data Centres will play an important role in the physical IT infrastructure market as IT is deployed in more remote and harsh locations. By their very nature they also lend themselves to being redeployed, extending and protecting the initial investment. Standardisation allows organisations to deliver infrastructure faster and, once operational, to manage and resolve support issues more efficiently. Off-site configuration and pre-commissioning also enable quality to be maintained, providing the client with peace of mind.