Read the latest articles and expert opinions from the SysMech team
2020 sees the launch of Zen 7, the latest release of our software platform, continuing the Zen evolution and bringing major new features and enhancements including cloud native deployment, support for 5G, Smart Energy IoT and large enterprise IP networks.
On premises or in the cloud, Zen 7 runs on bare metal, data centre VMware vSphere or cloud (AWS). Our continuing strategy of containerisation and migration to Kubernetes and Helm lowers data centre and cloud costs through best-in-class orchestration. Zen 7 has all of the benefits of an evolving architecture and technology stack, including:
Cloud native architecture is crucial for the long-term success of 5G and is the foundation for 5G innovation. Zen 7 provides service assurance and data analytics for the 5G era whilst addressing the challenges of operating in a hybrid environment. Zen is already providing service assurance functions for 5G networks alongside different generations of technology operating on traditional physical networks and cloud-based network virtualisation (NFV/SDN/VNF) for RAN/OpenRAN, Transmission and Core.
Zen is evolving to support the transition to 5G Stand Alone and the service-based opportunities it presents. These services will be implemented using end-to-end dynamic network slicing, providing ultra-reliable low-latency communications (URLLC), massive machine-type communications (mMTC) or enhanced mobile broadband (eMBB), and must meet the specific SLA requirements of vertical industry markets. The opportunity for CSPs is enormous; however, it comes at the price of exponential increases in complexity and in the data volumes produced by telecoms networks. Key to managing these new types of customer service are AI, machine learning and automation, based on distributed data analytics operating at different velocities and locations in the network, combined with service assurance and root cause analysis covering all the physical and abstracted layers involved in dynamic network slices and assuring the customer's SLA.
Designed from inception to address these types of requirements, Zen is now evolving to cater for the increasing demands of 5G Stand Alone network service assurance, analytics and automation.
Zen 7 includes major enhancements to the SNMP management functions. By providing insight into the performance of large, evolving, enterprise and safety critical networks, Zen helps to assure adherence to demanding SLAs.
An upgraded SNMP Poller offers enhanced monitoring and alerting functionality, providing up-to-the-minute monitoring of a vast number of device types, each with thousands of instances across the network, with visualisation through the feature-rich reporting available in Zen.
Zen implements a network of distributed SNMP probes to allow monitoring and reporting across the security layers of a large enterprise network. Using industry standard SNMP technology, Zen assures reliability and broad compatibility with existing infrastructure to provide a vendor agnostic view of the operation and performance of your IP network.
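As an illustration of the polling pattern described above (a minimal sketch, not Zen's implementation), the code below polls a list of devices and raises alert records on threshold breaches. `Device`, `poll_devices` and the `fetch` callable standing in for a real SNMP GET are all hypothetical names introduced for this example:

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Device:
    host: str
    oid: str            # OID of the metric to poll, e.g. an ifInErrors counter
    threshold: int      # raise an alert when the polled value exceeds this

def poll_devices(devices: list[Device],
                 fetch: Callable[[str, str], int]) -> list[dict]:
    """Poll each device once and return alert records for any breaches.

    `fetch` stands in for a real SNMP GET - in practice an SNMP library,
    or a remote probe sitting behind a security boundary.
    """
    alerts = []
    for d in devices:
        value = fetch(d.host, d.oid)          # one GET per device/OID pair
        if value > d.threshold:
            alerts.append({"host": d.host, "oid": d.oid, "value": value})
    return alerts
```

A real deployment would swap the `fetch` stand-in for an actual SNMP client and run the loop on a schedule from each distributed probe, but the poll-compare-alert shape stays the same.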
With Zen 7 our adaptable framework has been enhanced to support Digital Integration Platform (DIP) applications for Smart Energy Management and IoT applications.
The Zen DIP acts as a fully scalable device and application enablement platform. This facilitates command and control, providing monitoring and insight into the metrics from IoT and M2M devices, controllers, sensors, and hubs. Zen can scale to integrate with any number of devices, monitoring for deviations from model-based thresholds and proactively identifying anomalies from expected behaviour on Impulse. Zen can utilise its in-built Intellect to automate actions or alerts when co-existing conditions arise, helping to build new services or initiate automated activities.
Zen 7 enhances the comprehensive features and deep integration provided by the Zen platform to address all aspects of enterprise security. Zen uses an identity store (Windows/Linux) to manage user and group information and to determine access rights to the platform. There are two kinds of identity store: local, implemented in Zen Server; and external authentication technologies such as Active Directory, OpenLDAP, SAML, or OpenID, including SAML with F5 BIG-IP and Microsoft AD FS (Active Directory Federation Services). Authentication verifies a user's identity, which is then used to grant authorised access to Zen, whether to manage the server, or to publish, browse, or administer content.
Zen provides manager-of-managers functionality for Fault Management, Performance Management and Customer Experience Management. An extensive range of interfaces is available off the shelf for integration with Network Elements/Element Managers (physical and virtual) and 3rd-party OSS/BSS systems in the customer's environment.
Zen 7 introduces an enhanced SNMP service for integration with northbound operational systems. The following integration points are supported:
Zen can monitor KPIs based on any individual metric or perform rapid calculations across multiple metrics that form a KPI. The results are evaluated for deviation from group or entity-specific thresholds, generating SLA performance alerts for any KPI breach. The Zen 7 Northbound SNMP Service is now available to integrate with northbound systems to create performance KPI/SLA breach alerts.
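The evaluation step described above can be sketched roughly as follows (an illustration only; `evaluate_kpis` and its inputs are hypothetical names, not Zen's API). A success-rate KPI is computed per entity from its underlying metrics and compared against an entity-specific threshold, falling back to a group-level default:

```python
def evaluate_kpis(samples, thresholds, default_threshold=0.95):
    """Compute a success-rate KPI per entity and flag SLA breaches.

    samples:    {entity: (successes, attempts)} over the reporting period
    thresholds: per-entity SLA limits; entities not listed fall back to
                default_threshold (the group-level limit)
    """
    breaches = []
    for entity, (ok, total) in samples.items():
        kpi = ok / total if total else 1.0    # no attempts -> no breach
        limit = thresholds.get(entity, default_threshold)
        if kpi < limit:
            breaches.append({"entity": entity,
                             "kpi": round(kpi, 4),
                             "threshold": limit})
    return breaches
```

In a northbound integration, each returned breach record would be emitted as an SNMP trap or notification to the upstream system.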
Zen 7 introduces a new planned outage microservice that utilises the Zen Northbound SNMP service. The microservice can be configured so false alarms can be avoided during planned works, equipment commissioning and maintenance.
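The suppression logic such a microservice might apply can be sketched as follows (an assumption-laden illustration, not Zen's implementation): alarms whose source entity falls inside an active planned-work window are filtered out before they are forwarded northbound:

```python
from datetime import datetime

def suppress_planned(alarms, outages, now):
    """Filter out alarms raised by entities inside an active outage window.

    alarms:  list of {"entity": ..., ...} alarm records
    outages: list of (entity, start, end) planned-work windows
    now:     timestamp at which the alarms are being evaluated
    """
    def in_window(entity):
        return any(e == entity and start <= now <= end
                   for e, start, end in outages)
    return [a for a in alarms if not in_window(a["entity"])]
```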
Machine learning is a prominent topic in the field of telecommunications right now. With the promise of automated operational efficiencies, personalised customer offerings and fraud mitigation, it's no surprise that operators are investing in this advancing technology. In this blog, we discuss what operators actually need to be successful with machine learning.
One of the things we know is essential to machine learning is data, and lots of it! For telecoms operators, this usually isn’t a problem; there are more mobile devices than people across the globe, and behind every mobile device there is masses of data. Network operators have access to call detail records, service usage statistics, network equipment utilisation, network performance metrics, customer experience metrics and so on. The challenge actually arises when it comes to accessing and interpreting this data.
For many network operators, big data is stored in multiple places and accessed using disparate tool sets. To build a successful machine learning model, training data consisting of a set of parameters and a known outcome is required. Building a usable training data set from multiple storage locations, when the outcome often isn't automatically associated with the defined parameters, is a time-consuming and semi-manual task. Operators must also be confident in the data quality, and will need to deploy automated data quality fixing processes to ensure accurate results from their machine learning models. In fact, accessing high volumes of quality data can prove to be one of the most time-consuming parts of developing a new machine learning use case.
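A minimal sketch of this joining-and-cleaning step (hypothetical names throughout; a real pipeline would run on a data platform rather than in-memory dicts) keeps only records that have a known outcome and pass a basic quality gate:

```python
def build_training_set(features, labels, required):
    """Join feature rows to known outcomes, dropping unusable records.

    features: {record_id: {feature_name: value}} drawn from one or more stores
    labels:   {record_id: outcome} - the known result to learn against
    required: feature names that must be present and non-null
    """
    rows = []
    for rec_id, feats in features.items():
        if rec_id not in labels:
            continue                  # no known outcome -> cannot train on it
        if any(feats.get(name) is None for name in required):
            continue                  # simple data-quality gate
        rows.append({**feats, "outcome": labels[rec_id]})
    return rows
```

Even in this toy form, the two `continue` branches show where the semi-manual effort goes in practice: associating outcomes with parameters, and deciding what "quality" means for each feature.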
In a competitive commercial environment such as telecommunications, finding the right use cases for machine learning is essential. It's a hot topic in the industry right now, but that doesn't necessarily mean it is the right approach for everything. Andres Vegas, Global Big Data Director at Telefónica, explained at this year's Telco Data Analytics that Telefónica have achieved a 20% CAPEX saving via a new network deployment optimisation tool, which has no machine learning capabilities whatsoever. So, finding the right use cases where machine learning has the potential to deliver real impact is essential. Have a read of our blog, 5 machine learning applications in telecoms, for a flavour of the types of use cases beginning to emerge.
Machine learning roles are usually associated with software engineers and data scientists, who have the knowledge and experience to define and build the complex algorithms required. The types of skills needed include computer science fundamentals, programming, an understanding of probability and statistics, software engineering and data modelling expertise. These skills can sit with dedicated in-house resource, external partnerships with universities or research centres, and with machine learning vendors. The telecommunications industry is no different to any other in this respect; however, another resource is also required: telco data experts.
Telecoms data is notoriously complex, with incoherent naming conventions, entangled relationships and trends that are difficult to decipher. To build a successful machine learning model, an understanding of telecoms data is essential, not only to understand what it means but also to understand the relationships between different types of data. Programmers and telco data experts must work together to identify machine learning use cases and define the relevant data sets.
The question of hardware requirements for machine learning comes up often. Machine learning models can quite easily be built using an off-the-shelf computer; in fact, Microsoft have even managed to carry out machine learning on a Raspberry Pi. The challenges arise when it comes to actually training the model. Generally, more training data means a more accurate model, but it also means more processing power and longer processing times. And when you are running hundreds of data sets with small tweaks for each epoch, the processing time can soon add up.
Most telecoms operators already have access to an abundance of hardware since they usually run their own data centres. And for many machine learning use cases this can easily be sufficient, especially in the early days. If they also run a virtual environment, operators can quite easily spin up and spin down multiple workload processes as and when required to further resource their machine learning models. However, as machine learning use cases expand, or real-time use cases emerge, GPU-accelerated hardware should be considered. A GPU offers more cores than a traditional CPU, allowing it to process parallel workloads much more efficiently. GPUs are also better suited to multiple repetitive tasks with just small tweaks, as is the case when training a machine learning model. Of course, this comes at a cost; NVIDIA have designed the DGX-1 specifically for machine learning with eight GPUs, coming in at around £100,000.
This year’s Telco Data Analytics brought together some of Europe’s leading telecoms experts, tackling topics such as machine learning, new infrastructures and the impending General Data Protection Regulation (GDPR). In this blog, we share the top 5 topics from the 2017 event.
With GDPR on the horizon, it's no surprise that it was discussed by several speakers at this year's Telco Data Analytics. Andres Vegas, Global Big Data Director at Telefónica, explained that privacy and personal data management are top challenges that operators must tackle ahead of the fast-approaching GDPR compliance deadline of May next year. And with 89% of customers already worried about what happens with their data, along with harsh penalties for non-compliance, it's a challenge that must be taken seriously.
Also addressing the subject, Ludovic Lévy said that Orange, where he is VP of Global Data Strategy and Governance, is looking at GDPR as an opportunity, rather than a constraint. Operators must make the opt-in process transparent for subscribers; instead of pages and pages of terms and conditions, they could take a modular approach, with 'feature-based' opt-in. He used the example of a recent update to Waze, a popular navigation app. Updated terms and conditions allowed the app to access a user's calendar, alerting them when to leave for their next appointment based on traffic conditions. Although Waze had been transparent with this change, messages within terms and conditions can often be overlooked by users, and a simple 'new feature' opt-in would likely be better received and build users' trust.
Unsurprisingly, artificial intelligence and machine learning were discussed throughout the event. Almost everyone agreed that these technologies must be use-case driven to ensure they actually deliver a benefit to operators, whether financial or operational.
Perhaps one of the most polished use cases presented at the event was the use of machine learning algorithms to improve out-of-home advertising, presented by Roman Postnikov, Director of Customer Analytics and Segment Marketing at MegaFon. In this particular use case, MegaFon wanted to use their data to track movement and customise digital billboards to adapt to the local audience. However, the telecoms data was not accurate enough to track the physical paths taken by users, so a machine learning algorithm was developed to map the path with the highest likelihood. Results showed that 95% of the time an accuracy of <25m could be achieved, and this is expected to reach an accuracy of <10m over the next 12 months as the machine learning progresses.
A common pain point that was highlighted by both operators and vendors at this year’s event was the continuing challenge of data quality and data integration. Pratik Bose, Head of Mobile Big Data Solutions at EE/BT explained that data issues are rarely spoken about, but can have a massive operational impact, especially on newly arising use cases in the fields of machine learning and artificial intelligence. Joar Anderson from DigitalRoute expanded on the topic, saying that data scientists were spending up to 80% of their time on cleaning, collecting and organising data before they could utilise it, and this was an issue that must be acknowledged and tackled head-on.
Several discussions tackled the subject of combining legacy toolsets with real-time analytics solutions. A panel featuring voices from Docomo and Swisscom saw a common question arise: what actually is real-time? For Docomo, a data latency of 30 minutes for mobile alarming applications can be classed as real-time, where others may see a delay of a few seconds to a few minutes as real-time. What was also apparent at the event was that not everything needs to be in a real-time analytics environment immediately. Instead, a balance is needed between existing systems that already work well and specific use cases within a big data application.
Finally, a topic touched on by many was the resourcing of data scientists, specifically the challenges operators face when it comes to attracting the best people. Telcos are up against the likes of Google and Facebook, who can often appear more attractive, and whose type of data is usually more relatable. In fact, in an executive panel discussing the topic of creating a sustainable analytics culture, it was highlighted that the problem may well not be about data scientists, but about data translators, providing an understanding of telecoms data which can only come from experience within the industry.
As always, this year's event tackled some of the most pressing topics within telecoms right now, some recurring from last year's event, and some newly arising challenges. What was clear is that data analytics touches multiple areas across the telecoms environment and will surely be a hot topic well into the future.
Network planning and optimisation has never been more important within the telecoms industry; new competition, shrinking profits and higher customer demand mean operators need to improve their service performance whilst at the same time minimising their expenditure. This is driving new approaches to planning and optimisation, in which network engineers are utilising more diverse data to make operational decisions with the greatest impact. Network operators are now beginning to get answers to traditionally difficult questions such as "What experience do my enterprise customers really have?" or "What is the true value of existing and new cell tower sites?"
Traditionally, network planning and optimisation teams have used network performance statistics and network equipment faults to understand the network's behaviour and plan their future investments. However, by using this information in isolation, engineers are blind to the subscriber impact of their activities. For example, an engineer may look at the cell towers with the poorest performance across the entire country and flag them for maintenance or new investment. But without correlating this with subscribers in the area, or service uptake, it's unclear whether this is in fact the best investment.
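The impact-weighted view hinted at above can be sketched in a few lines (an illustration for this blog, not SysMech's method; all names are hypothetical): instead of ranking towers on raw performance alone, weight each tower's performance shortfall by the subscribers it serves:

```python
def prioritise_towers(towers):
    """Rank towers by performance shortfall weighted by subscribers affected.

    Each tower: {"id": ..., "cssr": call set-up success rate in [0, 1],
                 "subscribers": active users served by the tower}
    Returns (id, impact_score) pairs, highest impact first.
    """
    scored = [(t["id"], (1.0 - t["cssr"]) * t["subscribers"]) for t in towers]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)
```

Under this weighting, a moderately under-performing urban tower serving thousands of subscribers outranks a badly performing rural tower serving a handful, which is exactly the distinction performance statistics alone cannot make.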
This is a well-known issue across the telecoms industry, and the problem lies in the traditional operational support systems (OSS) in place. Older OSS tools typically do not provide the mix of data required to understand subscriber activity and the impact of network planning and optimisation activities.
Luckily most operators have now embraced the need for OSS transformation, and are investing in more sophisticated tools to provide improved operational intelligence. New methods of working and new use cases are now emerging, in which operational decisions are being made based on impact, not just performance statistics alone.
One such example is taking a localised approach to network planning and optimisation to fully understand the entire situation in specific areas across the country. Using SysMech’s Zen Operational Intelligence Software, network engineers are creating a ‘dynamic grid’ view of the country, from which they can see key performance metrics, drill down to a ‘zoomed in’ local grid and identify the subscribers in the area alongside new business opportunities and profitability. With this information, they can then make better informed decisions based upon the impact on subscribers and potential new business revenue.
Firstly, the country is split into 10km grids, displaying a RAG status for a key metric, such as call set-up success rate, along with sizing based on a secondary metric, for example combined data consumption volumes. These metrics are completely dynamic, allowing network operators to compare various elements of the network's performance and usage.
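A minimal sketch of such a grid view (hypothetical names and thresholds; in the real product these are configurable) colours each grid cell by the primary metric and sizes it by the secondary one:

```python
def rag_status(cssr, amber=0.97, green=0.99):
    """Classify a call set-up success rate into a red/amber/green band."""
    if cssr >= green:
        return "green"
    if cssr >= amber:
        return "amber"
    return "red"

def grid_view(cells):
    """Build display records for a map layer.

    cells: {grid_id: (cssr, data_volume_gb)} - primary metric drives the
    colour, secondary metric drives the rendered size of the cell.
    """
    return [{"grid": gid, "status": rag_status(cssr), "size": volume}
            for gid, (cssr, volume) in cells.items()]
```

Because both metrics are just inputs here, swapping call set-up success rate for, say, drop rate or throughput changes only the data fed in, which mirrors the "completely dynamic" behaviour described above.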
Areas of poor performance with a high user base can then be zoomed in on to provide more detail on that specific locality. Network engineers can see the same metrics for a 1km grid square, along with the location of the cell towers and key users such as enterprise customers. With the click of a button, network engineers can also bring up information on the opportunities within the locality, including private and public businesses, how many subscribers have used the cell towers within the grid, and spots with the potential for new cell tower deployment. They can also see the profitability of existing cells in the locality to get a better understanding of how the area currently performs.
With all this information in one view, network engineers can rapidly identify the local regions with the greatest potential for new investment or optimisation. Key decisions can be made knowing how subscribers will be impacted. In one example, a feed of handset application data identified a dense area of total service loss. With localised investigation, the operator could easily identify the cause of the problem: a large hospital with poor indoor coverage. They could then make the decision to deploy a small cell in that area to improve indoor coverage. Small cells have a much lower cost than new cell towers, and deliver significant impact on indoor coverage.
This type of network planning and optimisation is now being implemented by many network operators, and numerous use cases are beginning to emerge, demonstrating real, measurable impact on both customer satisfaction and return on investment. Head over to the SysMech website to see more impact led use cases.
With the launch of 5G planned for 2020, it is the year we are all talking about! But what impact will it have on the telecoms landscape of the future? In this blog, we look further ahead to telecoms networks in 2030.
2030 may seem a long way off, but twelve years can fly by pretty quickly. After all, it was only twelve years ago that President Bush was in power, the Nintendo Wii was released and Pluto was downgraded as a planet. In the world of telecoms, a lot can change in that time, and with 5G now on the horizon, what will the telecom networks in 2030 likely look like?
If we look back at 4G, the time between commercialisation and widespread coverage wasn't very long at all. Commercialised in 2010, 4G was available to over a third of the global population just five years later. So, if 5G follows in its footsteps and stays on track to launch at the 2020 Olympics, by the time 2030 arrives it will undoubtedly be widespread across the globe.
And with 5G firmly in place, it’s likely that most, if not all 2G networks will be switched off by 2030. In fact, some operators have already got the ball rolling with Singapore, America and Australia leading the way. However, it looks like the retirement of 2G across Europe may be a little slower; 2G networks still remain profitable and offer the opportunity to carry M2M communications. Telenor Norway has suggested that it will actually shut down its 3G network before retiring 2G. It predicts that 3G will be retired in 2020 and 2G will not be retired until five years later in 2025. By the time 2030 arrives, it’s likely that a lot of 3G networks will have also reached the end of their era. Of course, with 2G and 3G out of the picture, Voice over LTE will be the predominant voice channel. VoLTE users are already growing in China, America and India, and VoLTE subscriptions are expected to reach 4.6 billion by 2022.
And the new buzzword in 2030? 6G of course! Since the early '90s we have seen the launch of a new and improved generation of mobile technology every decade: in '91 we had 2G, in '98 we had 3G, in 2010 we had 4G, and (if all goes to plan) in 2020 we will have 5G. So if this trend continues, by 2030 we will at least be discussing 6G.
When it comes to fixed networks, the home phone has been on its way out for a while now. 2016 was the first year where there were more households without a home phone than with a home phone in the US. And in many developing countries, where fixed infrastructure is highly unreliable, people have skipped the home phone completely and gone straight to mobile; Ghana, Uganda, Pakistan and Indonesia all have home phone penetration rates of less than 5%. So with western countries phasing it out, and developing countries going straight to mobile, the home phone will likely be a thing of the past by 2030.
For broadband, copper is nearing its maximum capacity and is quickly coming to be seen as old tech as fibre coverage increases. Of course, fixed-line networks will still play a major part in our communications infrastructure in 2030. However much mobile speeds, coverage and capacity advance, mobile can still prove problematic for indoor coverage, and many consumers, and most businesses, will still require high-speed fixed-line connections. 4K video will also put a much greater strain on our networks over the coming decade, which will help cement the status of fibre well into the future.
Continual, rapid change is thankfully something that communications network operators are used to, and can rapidly adapt to; the period from now until 2030 will be no exception. Firstly, operational teams will have to adapt to the planning, deployment and monitoring of new 5G technology, likely to be quite a change from previous generations. Monitoring the VoLTE experience will also become a priority as it begins to replace older voice generations. Many operators will adopt value-based network planning, in which they look at connectivity as a whole rather than treating mobile and fixed-line separately. By mapping their coverage, capacity and quality in a geographical grid-like manner, operators can see coverage across all technologies. They can plan based on what is needed within a specific area and make economical investments; for example, installing a small cell rather than a new cell site may be sufficient.
In 2030, traditional network operators are also likely to be tackling a new threat in the marketplace: internet by satellite. SpaceX, founded by Elon Musk and part-funded by Google, plans to begin launching over 4,000 satellites into space in 2019, with the aim of providing high-speed internet on a global scale. With the promise of low latency, good speeds and improved rural coverage, it may well be one of the biggest threats that telco operators will see in the next decade.
Of course, we may well see completely new communication technologies arrive over the next 12 years, but if current developments and market trends remain on track, the telecommunications landscape could look quite different by the time we reach 2030.