IoT devices produce large volumes of varied data, and that data must be sent to the cloud for analysis.
The growing amount of information bound for the cloud often leaves no opportunity to pre-process the data first.
Accordingly, IT experts have suggested using the fog computing paradigm to reduce the workload on data centers.
Cisco’s approach to applying fog computing to IoT data processing has received a great deal of attention from companies.
In this article, we briefly review Cisco’s solution to the problem of large-scale data processing.
Cisco’s approach to processing and managing this data rests on the following three principles (a minimal sketch of the resulting triage logic follows the list):
- Instead of sending large amounts of IoT data to the cloud, time-sensitive data should be processed at the edge of the network, near the equipment that generated it.
- Any action taken on the data under the defined policies should complete within a few milliseconds.
- Data that is not time-sensitive should be selected and sent to the cloud for analysis.
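A minimal sketch of that triage logic, assuming hypothetical `process_locally` and `forward_to_cloud` handlers and a per-reading `deadline_ms` field (all names invented for illustration):

```python
# Assumed policy: readings that must be acted on within this window are
# handled at the edge; everything else is forwarded to the cloud.
EDGE_DEADLINE_MS = 100

def process_locally(reading):
    # Hypothetical edge handler: act on the data within milliseconds.
    print(f"edge action: {reading['sensor_id']} = {reading['value']}")

def forward_to_cloud(reading):
    # Hypothetical uplink: queue delay-tolerant data for cloud analysis.
    print(f"queued for cloud: {reading['sensor_id']}")

def triage(reading):
    if reading["deadline_ms"] <= EDGE_DEADLINE_MS:
        process_locally(reading)   # time-sensitive: stays at the edge
    else:
        forward_to_cloud(reading)  # delay-tolerant: sent to the cloud

triage({"sensor_id": "temp-7", "value": 92.5, "deadline_ms": 10})
triage({"sensor_id": "vibration-3", "value": 0.02, "deadline_ms": 60_000})
```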
The Internet of Things has improved the speed of response to events. Shorter response times in important industries such as oil and gas, utility services, transportation, mining, and some state-owned companies can improve work efficiency, output, service quality, and security.
Suppose the temperature sensor on a critical manufacturing machine suddenly sends a failure warning.
The person in charge can go to the machine immediately and repair it before it breaks down.
In important industries such as petrochemicals and oil and gas, sensors installed on pipelines can detect any temperature change. The transfer pumps can then automatically reduce their speed in response, avoiding major industrial and environmental disasters.
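As a rough sketch of that reaction, assuming a hypothetical pump-controller interface and thresholds chosen purely for illustration:

```python
class Pump:
    """Stand-in for a real pump-controller interface."""
    def set_speed(self, pct):
        print(f"pump speed set to {pct}%")

TEMP_THRESHOLD_C = 80.0   # illustrative trip point
REDUCED_SPEED_PCT = 40    # illustrative reduced transfer speed

def on_temperature_reading(temp_c, pump):
    # React locally, in milliseconds, instead of waiting on a cloud round trip.
    if temp_c > TEMP_THRESHOLD_C:
        pump.set_speed(REDUCED_SPEED_PCT)
        return "slowed"
    return "nominal"

print(on_temperature_reading(85.2, Pump()))  # -> "slowed"
```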
Suitable infrastructure for fog computing
Investing in the Internet of Things means a new kind of infrastructure. The common models governing the cloud world cannot properly manage this amount of rapidly generated information.
Today, billions of devices that were previously unintelligent produce about two exabytes of data per day. A few years ago, Cisco predicted that by 2020 nearly 50 billion devices would be connected to the Internet and that conventional infrastructure could not carry that much data to the cloud.
On the other hand, data centers cannot analyze this amount of data in the blink of an eye and return information to endpoint devices, because current bandwidth cannot support that much information exchange.
Conventional cloud models also face many limitations in processing data that is generated at this rate.
The objects that connect to the Internet today are heterogeneous and each has its own functions.
For example, some devices use industrial protocols rather than IP to connect to their controllers. Their data must therefore be translated to IP before being sent to the cloud for analysis or storage, and the results must be analyzed and returned to the target devices as quickly as possible.
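As an illustration of that translation step, here is a small sketch that unpacks an assumed binary frame (the field layout is invented for this example) and re-emits it as an IP-friendly JSON payload:

```python
import json
import struct

# Assumed layout of a raw frame from a non-IP industrial sensor (big-endian):
# 2-byte device id, 4-byte float reading, 4-byte unsigned timestamp.
FRAME_FORMAT = ">HfI"

def frame_to_json(frame: bytes) -> str:
    """Translate an industrial-protocol frame into a JSON payload for IP transport."""
    device_id, reading, timestamp = struct.unpack(FRAME_FORMAT, frame)
    return json.dumps({"device_id": device_id,
                       "reading": round(reading, 2),
                       "timestamp": timestamp})

raw = struct.pack(FRAME_FORMAT, 42, 71.3, 1_700_000_000)  # simulated frame
print(frame_to_json(raw))  # ready to send over IP toward the cloud
```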
For example, when the temperature in chemical reservoirs quickly exceeds the set threshold, the relevant authorities should take appropriate action immediately.
The time spent reading the temperature, transferring the data from the edge to the cloud, and then analyzing it eliminates any possibility of precautionary action.
For this reason, Cisco has proposed a special computing model for this class of data, with the following characteristics:
Minimize latency:
In sensitive locations, data must be processed near the equipment that generated it, preventing disasters and system failures. In some industries and services, even a few milliseconds matter.
Conserve network bandwidth:
In some settings, such as oil rigs, roughly 500 gigabytes of data are generated each week; in the aviation industry the figure is several terabytes per hour.
Transferring that much data from edge equipment to the cloud is impractical, and since some of the analysis requires neither cloud storage nor cloud processing, doing so simply wastes bandwidth.
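One common way to avoid that waste is to summarize at the edge and upload only the summaries. A minimal sketch, with the window size and summary fields chosen purely for illustration:

```python
from statistics import mean

WINDOW = 60  # samples per summary, e.g. one minute of 1 Hz readings (assumed)
buffer = []

def summarize(samples):
    # One compact record replaces WINDOW raw data points on the uplink.
    return {"count": len(samples), "min": min(samples),
            "max": max(samples), "mean": round(mean(samples), 3)}

def on_sample(value, upload):
    buffer.append(value)
    if len(buffer) >= WINDOW:
        upload(summarize(buffer))
        buffer.clear()

for i in range(120):                       # two minutes of simulated readings
    on_sample(20.0 + (i % 7) * 0.1, upload=print)
```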
Address security concerns:
IoT data must be encrypted when sent and received, and the encryption must cover the entire send-and-receive chain so that an attack on IoT equipment cannot distort or corrupt the information.
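A minimal sketch of payload encryption along that chain, using the `cryptography` package; how the edge and the cloud share the key is out of scope here and simply assumed:

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Assumed: the fog node and the cloud endpoint share this key out of band.
shared_key = Fernet.generate_key()

def send(payload: bytes) -> bytes:
    """Encrypt at the edge, before the payload ever leaves the device."""
    return Fernet(shared_key).encrypt(payload)

def receive(token: bytes) -> bytes:
    """Decrypt at the destination; a tampered token raises InvalidToken."""
    return Fernet(shared_key).decrypt(token)

token = send(b'{"sensor": "temp-7", "value": 92.5}')
print(receive(token))  # original payload, recoverable only if untampered
```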
Operate reliably:
Today, IoT data affects the safety of citizens and critical infrastructure, so proper arrangements must be made for integrity and availability.
Collect and secure data across a wide geographic area with different environmental conditions:
IoT equipment may be deployed across hundreds of square meters or more, intelligently installed to record ground vibrations and the condition of rails, roads, service stations, and toll points, closely monitoring activity in outlying areas.
Move data to the best place for processing:
The best place to process data is the location nearest the device that generated it. On-site processing eliminates needless back-and-forth data transfer and frees cloud resources for more important workloads.
As mentioned, the traditional cloud computing architecture cannot meet all of these needs, because it rests on the principle that all data must travel from the edge to the data center for processing, which drives latency up.
Another problem with the traditional method is its limited bandwidth relative to the many devices trying to send data.
Note that in some sensitive and important industries where privacy is a concern, certain sensitive data cannot be stored off-site at all.
Needless to say, cloud servers communicate only over IP, not over the large number of industrial protocols used by IoT equipment.
For this reason, the best place to analyze the vast majority of IoT data is near the equipment that generates the data and acts on it.
What is fog computing?
Fog computing places cloud-style infrastructure and processing capability near the equipment that generates the IoT data to be processed (Figure 1).
The equipment used for this purpose is called a fog node. Fog nodes can connect network equipment at any location, from various parts of a plant to power poles, railways, or oil rigs, and enable local processing.
In this model, any device with network connectivity, storage space, and processing resources can act as a node.
For example, controllers, switches, routers, embedded servers, and CCTV cameras are among these devices.
IDC estimates that devices located close to IoT equipment can analyze about 40 percent of the data. This approach significantly reduces latency and prevents several gigabytes of traffic from needlessly crossing the network or carrying sensitive data outside it.
Fog applications, like the Internet of Things itself, are extremely varied. However, features such as monitoring and real-time analysis of data from networked objects appear in all of them. These applications can support machine-to-machine (M2M) communication and human-machine interaction (HMI). Important applications have also been developed for the petrochemical, oil and gas, utility, transportation, and government sectors. When applying fog computing, note that the data analysis process must complete in less than one second.
How does fog computing work?
Developers design IoT applications and deploy them to fog nodes at the edge of the network. The fog nodes closest to the edge receive data from IoT equipment and direct it, whatever its type, to the appropriate place for analysis.
Table 1 shows how fog nodes extend cloud technology to the edge of the network.
| | Fog nodes nearest IoT equipment | Fog aggregation nodes | Cloud |
|---|---|---|---|
| Response time | Milliseconds to under one second | Seconds to minutes | Minutes, days, or weeks |
| Example applications | Machine-to-machine communication, including telemedicine and training | Visualization, simple analytics | Big-data analytics, graphical dashboards |
| How long IoT data is stored | Transient | Short term: hours, days, or weeks | Months or years |
| Geographic coverage | Local (city scale) | Beyond the local (city) scale | Global |
Table 1
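A small sketch of the routing that Table 1 implies, keyed on how quickly the data must be acted on; the 300-second cutoff is an assumption standing in for "a few minutes":

```python
def choose_tier(required_response_s: float) -> str:
    """Pick the processing layer from Table 1 for a given response requirement."""
    if required_response_s < 1:
        return "fog node nearest the equipment"  # milliseconds to < 1 second
    if required_response_s < 300:
        return "fog aggregation node"            # seconds to minutes
    return "cloud"                               # minutes, days, or weeks

for need_s in (0.05, 30, 86_400):
    print(f"{need_s}s -> {choose_tier(need_s)}")
```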
In fog computing, the nodes closest to the data-generating objects analyze the time-sensitive data. For example, in a Cisco smart grid distribution network, the emphasis is on keeping the control and protection loops working properly.
As a result, fog nodes close to the grid sensors can detect signals that indicate a problem and prevent serious failures by sending control commands to actuators.
Data that can tolerate processing within seconds or minutes is sent to an aggregation node.
In the case of smart grids, each substation may have its own aggregation point that reports the status of each downstream feeder.
Data that is not time-sensitive is sent to the cloud for historical analysis and long-term storage.
In this case, thousands or hundreds of thousands of fog nodes may send data to the cloud for analysis, storage, and archiving.
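A minimal sketch of such a control-and-protection loop, with the fault threshold and the breaker interface invented purely for illustration:

```python
class Breaker:
    """Stand-in for a real actuator interface on a distribution feeder."""
    def open_breaker(self, feeder_id):
        print(f"breaker opened on feeder {feeder_id}")

FAULT_CURRENT_A = 400.0  # illustrative fault threshold

def monitor(feeder_id, current_a, actuator, report):
    if current_a > FAULT_CURRENT_A:
        actuator.open_breaker(feeder_id)  # act locally, within milliseconds
        report({"feeder": feeder_id, "status": "tripped"})
    else:
        # Status summaries go upstream to the substation aggregation point.
        report({"feeder": feeder_id, "status": "ok", "current_a": current_a})

monitor("F-12", 452.0, Breaker(), report=print)
```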
Advantages of using fog computing
One of the most important benefits of fog computing is better business agility: with the right tools, developers can quickly build fog applications and deploy them wherever industry needs them. Equipment manufacturers can offer MaaS solutions to their customers. And because fog applications are highly flexible, devices can be adapted to customer needs.
Better security:
Fog nodes can be protected with the same control approaches and policies used across the rest of the IT environment, and existing solutions can provide the physical security and cybersecurity of the equipment.
Deeper insight with privacy control: Instead of sending sensitive data to cloud servers, it can be processed and analyzed locally.
This lets IT teams control and monitor the equipment that collects, analyzes, and stores the data.
Lower deployment and operating costs: Instead of sending data to the cloud for processing, it can be processed locally so that bandwidth is not wasted.
The last word
Fog computing has created conditions in which the cloud can comfortably handle the roughly two exabytes of data that IoT equipment generates each day. Another important advantage of fog computing is its ability to manage large volumes of diverse data that require fast processing.
More precisely, it is possible to use equipment with local processing resources to analyze the data.
All in all, organizations that use fog computing have access to more accurate and faster information, which leads to agility, improved service quality, and increased security.