To get a recording of this webinar, click the link below.
With data now abundant, it has permeated almost every business, and Telecom Tower companies in particular are actively using data management systems to gain a competitive advantage.
However, data reliability poses a challenge for Towercos: when in-house data expertise is lacking, or solution providers fail to break data down into consumable chunks, they are prevented from making the best business decisions. What should give them a competitive edge instead lands them in confusion and adds to their struggle to solve business problems.
Watch this webinar to discover ways to unlock the value of the data you are collecting.
Question 1: The vast pool of data available today is overwhelming. What exactly should be an organisation's first step in building a comprehensive data strategy? Are there categories into which we should break data down to reap results?
Organisations today focus on operational excellence, and to achieve it they need to make digitization their competitive advantage. Data coming from tower-site sources is at best 40-60% usable, so processing steps should be added so that what feeds data modelling and analysis is of much higher quality than what is received. By simply adopting these rules, data can be taken from 60% accuracy to 80% accuracy.
To get results, look into the following four steps: first, collect the data; second, translate it into common formats so it can be used; third, use algorithms to identify and flag data inconsistencies; and fourth, use those flags to fine-tune the discrepancies that arise in the data set.
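The four steps above can be sketched in a few lines of code. This is a minimal illustration, not a production pipeline: the field names, date formats and the "fuel can never be negative" rule are all hypothetical examples standing in for whatever sources and validation rules a given Towerco actually has.

```python
from datetime import datetime

# Step 1 (collect): hypothetical raw readings from two site sources,
# arriving in inconsistent formats.
raw_readings = [
    {"site": "TWR-001", "ts": "2023-01-15 08:00", "fuel_litres": "420"},
    {"site": "twr-001", "ts": "15/01/2023 09:00", "fuel_litres": "415"},
    {"site": "TWR-001", "ts": "2023-01-15 10:00", "fuel_litres": "-50"},  # sensor glitch
]

def standardise(reading):
    """Step 2: translate each record into one common format."""
    parsed = None
    for fmt in ("%Y-%m-%d %H:%M", "%d/%m/%Y %H:%M"):
        try:
            parsed = datetime.strptime(reading["ts"], fmt)
            break
        except ValueError:
            continue
    return {
        "site": reading["site"].upper(),
        "ts": parsed,
        "fuel_litres": float(reading["fuel_litres"]),
    }

def flag_inconsistencies(records):
    """Step 3: a simple rule -- a fuel reading can never be negative."""
    return [dict(r, suspect=r["fuel_litres"] < 0) for r in records]

def correct(records):
    """Step 4: drop (or later fine-tune) the flagged records."""
    return [r for r in records if not r["suspect"]]

clean = correct(flag_inconsistencies([standardise(r) for r in raw_readings]))
print(len(clean))  # 2 usable records out of the 3 collected
```

Real pipelines would replace the single rule in step 3 with a library of site-specific checks, but the shape of the flow stays the same.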
Question 2: What are the hardware requirements to turn the data into valuable insights?
There are multiple sets of data that can be gathered from a site in order to assess its performance and make operations more efficient. This data can be classified into different buckets and collected using different kinds of hardware deployed on the sites.
1) Alarm Data – Any anomaly detected on a site can be reported to the back-end platform. These alarms can be collected from sensors deployed on the site or tapped from the controllers already installed there (DG Controller, AC Controller, Integrated Power Management System, door sensors, fuel sensors).
2) Energy Data – To assess performance, smart meters can be deployed on the site. These meters help in understanding the energy consumption patterns and running hours of the equipment. Different types of meters can be deployed depending on the site configuration (single-phase or three-phase); meters can be Alternating Current energy meters (230V) or Direct Current meters (48V).
3) Surveillance Data – IP cameras can be installed on the site to provide real-time site information. These snapshots and videos can be assessed to understand the movement of people on the site and other security- and performance-related parameters.
4) Access Data – With smart locks installed on the site, site access management can be automated. This can help you understand who has access to the sites and for what duration. Site security can be tightened using smart locks, as only people who have access can enter and work on the sites.
5) Various Environmental Parameters – Sensors/data loggers can be installed on the site to collect temperature and humidity data. Temperature and humidity play an important role in efficient performance on the site.
6) RMS (Remote Monitoring Solution) – This is an amalgamation of all the hardware/sensors connected to one unit called a Remote Terminal Unit (RTU). It collects the data from the different sensors installed at the site and sends it to the platform where the data is processed, thereby helping in analysing and monitoring asset performance in real time.
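As a rough sketch of what an RTU does with the sensor categories above, the snippet below bundles one polling cycle of readings into a single message for the back-end platform. The sensor names, field names and payload shape are illustrative assumptions, not a real RMS protocol.

```python
import json
from datetime import datetime, timezone

# One polling cycle of readings an RTU might gather from the
# sensors described above (all values hypothetical).
sensor_readings = {
    "dg_controller": {"running": True, "run_hours": 1243.5},
    "fuel_sensor": {"level_litres": 380.0},
    "door_sensor": {"open": False},
    "environment": {"temp_celsius": 31.2, "humidity_pct": 64.0},
}

def build_telemetry(site_id, readings):
    """Bundle all sensor data into one timestamped message for the platform."""
    return {
        "site_id": site_id,
        "reported_at": datetime.now(timezone.utc).isoformat(),
        "readings": readings,
    }

# The RTU would transmit this serialized payload to the platform.
payload = json.dumps(build_telemetry("TWR-001", sensor_readings))
```

The value of the RTU is exactly this consolidation: one device, one message format, regardless of how many sensor types sit behind it.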
Question 3: Are there any roadblocks to be aware of when sourcing and gathering data? If so, how can they be overcome?
A frequent problem is having multiple versions of the data: different departments inside an organisation each depend on the data they collect themselves. It is therefore important to have a data lake where all the data is brought together and compiled to create a single source of truth before it is taken further for analysis, process improvement and decision making.
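One simple way to picture that consolidation step: when two departments hold overlapping records for the same site, keep the most recently updated one. The department names, records and merge rule below are hypothetical; real data lakes apply far richer reconciliation logic, but the principle of collapsing duplicates into one authoritative record is the same.

```python
# Two departments' overlapping copies of site records (illustrative data).
operations = [
    {"site": "TWR-001", "status": "active", "updated": "2023-03-01"},
    {"site": "TWR-002", "status": "active", "updated": "2023-02-10"},
]
finance = [
    {"site": "TWR-001", "status": "active", "updated": "2023-03-05"},
    {"site": "TWR-003", "status": "decommissioned", "updated": "2023-01-20"},
]

def single_source_of_truth(*sources):
    """Merge all departmental copies, keeping the newest record per site.

    ISO dates compare correctly as strings, so a plain > works here.
    """
    merged = {}
    for source in sources:
        for rec in source:
            key = rec["site"]
            if key not in merged or rec["updated"] > merged[key]["updated"]:
                merged[key] = rec
    return merged

truth = single_source_of_truth(operations, finance)
print(sorted(truth))  # ['TWR-001', 'TWR-002', 'TWR-003']
```

For TWR-001, the finance copy wins because it was updated later; every downstream analysis then reads from `truth` rather than from either department's private copy.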