An experimental study of fog and cloud computing in CEP-based Real-Time IoT applications

Edge computing – commonly referred to as simply “the edge” – allows data to be processed closer to where it originates, which can significantly reduce network latency. By physically bringing processing closer to the data source, there is less distance the data needs to travel, improving the speed and performance of devices and applications. However, edge computing has limitations: capabilities such as real-time analysis and machine learning at scale are better achieved with fog computing.


Cloud computing is a model for enabling ubiquitous, on-demand access to a shared pool of configurable computing, storage, and networking resources. Cloud computing, storage, and networking solutions provide users and enterprises with various capabilities to store and process their data in third-party data centers. Mobile cloud computing (MCC) emerged with the proliferation of smart mobile devices, 3/4/5G connectivity, and ubiquitously accessible WiFi networks, and it was originally promoted as a way to enable cloud computing applications on mobile devices.

But first… cloud computing

If these measurements are sent to the cloud every second, the data piles up to a massive amount. Fog computing mitigates this because it allows data to stay on-device, requiring less contact with public cloud networks and platforms. The word ‘fog’ in fog computing is a metaphor, since fog is defined as cloud close to the ground.
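To see why per-second uploads pile up, a quick back-of-the-envelope calculation helps. The sensor count and payload size below are illustrative assumptions, not figures from this article:

```python
# Back-of-the-envelope estimate of raw data volume when every sensor
# reading is shipped to the cloud. The sensor count and payload size
# are illustrative assumptions.
def daily_upload_bytes(num_sensors: int, payload_bytes: int, readings_per_sec: int = 1) -> int:
    """Total bytes sent to the cloud per day if nothing is filtered locally."""
    seconds_per_day = 24 * 60 * 60
    return num_sensors * payload_bytes * readings_per_sec * seconds_per_day

# 10,000 sensors, 200-byte readings, one reading per second:
total = daily_upload_bytes(10_000, 200)
print(f"{total / 1e9:.1f} GB/day")  # 172.8 GB of raw data per day
```

Even a modest deployment generates hundreds of gigabytes per day if nothing is filtered at the edge, which is exactly the load fog computing is meant to absorb.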

Fog computing is a computing architecture in which a series of nodes receives data from IoT devices in real time. These nodes process the data they receive with millisecond response times and periodically send analytical summary information to the cloud. A cloud-based application then analyzes the data received from the various nodes with the goal of providing actionable insight. The results of these computations are passed back down the computation stack so that they can be used by human operators and to facilitate machine-to-machine communication and machine learning.
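The pattern described above, immediate local reaction plus periodic summaries sent upstream, can be sketched as follows. The class name, alarm threshold, and summary interval are illustrative assumptions:

```python
# A minimal sketch of a fog node: process readings locally, react
# immediately to anomalies, and forward only a compact periodic summary
# to the cloud instead of every raw reading.
from statistics import mean

class FogNode:
    def __init__(self, alarm_threshold: float, summary_every: int):
        self.alarm_threshold = alarm_threshold
        self.summary_every = summary_every   # readings per summary
        self.window: list[float] = []
        self.alarms: list[float] = []
        self.summaries: list[dict] = []

    def ingest(self, reading: float) -> None:
        # Real-time local processing: fire an alarm without a cloud round trip.
        if reading > self.alarm_threshold:
            self.alarms.append(reading)
        self.window.append(reading)
        # Periodically ship an analytical summary instead of raw data.
        if len(self.window) == self.summary_every:
            self.summaries.append({"count": len(self.window),
                                   "mean": mean(self.window),
                                   "max": max(self.window)})
            self.window.clear()

node = FogNode(alarm_threshold=80.0, summary_every=4)
for r in [21.0, 22.5, 95.0, 23.0, 20.0, 21.5, 22.0, 19.5]:
    node.ingest(r)
print(len(node.alarms), len(node.summaries))  # 1 alarm, 2 summaries
```

The alarm fires the instant the threshold is crossed, while the cloud only ever sees two small summary records instead of eight raw readings.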

In the case of fog computing (see Fig. 5a), the edge level performs all the data processing while the core level only handles the storage of information. More specifically, a CEP engine and a Broker are deployed in every Fog Node of the edge level for Local Event generation. Moreover, there are several open-source frameworks for distributed stream processing, which exhibit different performance characteristics and suit different use cases. A comparative evaluation of the most popular ones can be found in Nasiri et al. According to this study, Apache Flink can run real-time data processing pipelines in a fault-tolerant way at a scale of millions of tuples per second. CEP (Complex Event Processing) is a technology that ingests, analyzes, and correlates large amounts of heterogeneous data with the aim of detecting relevant situations in a particular domain.
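The core idea of a CEP rule is to correlate a stream of simple events into a higher-level "complex" event. As a minimal sketch, assuming an invented rule (three consecutive high readings) rather than any rule from the article:

```python
# Toy CEP-style rule: correlate simple temperature events into a
# higher-level "SustainedHighTemp" complex event. The rule (three
# consecutive readings over a threshold) is an illustrative assumption.
def detect_complex_events(stream, threshold=70.0, run_length=3):
    """Emit a complex event whenever `run_length` consecutive readings exceed `threshold`."""
    events, run = [], 0
    for i, value in enumerate(stream):
        run = run + 1 if value > threshold else 0
        if run == run_length:
            events.append({"type": "SustainedHighTemp", "ends_at": i})
            run = 0  # reset so detected runs do not overlap
    return events

readings = [65, 72, 75, 71, 40, 90, 91, 66]
print(detect_complex_events(readings))  # one event ending at index 3
```

A production CEP engine (Esper, Flink CEP, Siddhi) expresses such rules declaratively and handles windows, out-of-order events, and state for you; the point here is only the shape of the computation.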

The similarities between edge and fog computing

This is a prime example of edge computing, as all inputs and processing take place on the edge device, which can be a gaming console, personal computer, or smartphone. As this form of gaming is highly sensitive to latency, only the metadata from the game session is transmitted to the cloud for processing. Provided the connections between the edge devices and the cloud server are stable, the outcomes of all players’ actions are displayed in real time.

The new technology is likely to have the biggest impact on the development of IoT, embedded AI, and 5G solutions, which demand agility and seamless connections like never before. According to Statista, there were 30 billion IoT devices worldwide by 2020, and by 2025 this number will exceed 75 billion connected things. Cloud computing increases cost savings, as workloads can be transferred from one cloud platform to another. Fog computing uses a variety of protocols and standards, so the risk of failure is low. Fog performs short-term analysis at the edge for an immediate response, while the cloud aims for deeper, longer-term analysis at the cost of a slower response.


The pervasive IoT applications are managed by resource virtualization through fog, cloud, and mobile computing. Resource virtualization is handled by cloud computing and introduces some challenging resource-management tasks. We understand fog computing as the computation layer between the cloud and the edge.

Fog Computing vs. Cloud Computing

While fog computing has some advantages over cloud computing, it is not likely to replace it entirely. Fog computing is more efficient because data is processed closer to the source, which reduces latency. It is also more secure because data does not have to travel as far and is, therefore, less likely to be intercepted.

  • Therefore, a difference in both flows lies first in the location of the CEP module for event detection and the Broker for subscription.
  • Irrelevant data might be sent to the cloud in addition to the useful information that’s actually needed.
  • There are some key differences in terms of where these services are actually located.
  • The fog architecture is distributed and consists of millions of small nodes located as close as possible to the client device.

  • Increased collaboration – cloud-based solutions make it easy for employees to collaborate on projects in real time from any location.
  • Improved disaster recovery – cloud providers offer comprehensive disaster recovery solutions that can help businesses recover from major disruptions quickly and efficiently.
  • Improved user experience – instant responses and no downtime satisfy users.
  • PaaS – a development platform with tools and components for creating, testing, and launching applications.
  • Power efficiency – edge nodes run power-efficient protocols such as Bluetooth, Zigbee, or Z-Wave.

Unfortunately, there is nothing immaculate, and cloud technology has some downsides, especially for Internet of Things services.

Cloud computing relies heavily on centralized servers located far away from users, which can lead to slower response times and lag. In contrast, fog computing distributes resources much more locally, effectively bringing the processing power closer to the user. In this vein, Jalali et al. carry out a comparative study between Data Centers with a cloud computing architecture and Nano Data Centers with fog computing, the latter implemented with Raspberry Pis. The performance of the two architectures is evaluated from different angles, but always with a focus on energy consumption.


In fact, Gartner projects that 75% of enterprise data will be generated outside of centralized systems by 2025. Such a situation could put tremendous strain on both local networks and the internet at large. In practice, these two technologies work with each other to add value through data. In edge networks, cloud computing is often dedicated to tasks that require more computing power, such as large-scale artificial intelligence and machine learning operations.


Finally, latency is not the only thing to evaluate in both architectures; the distribution of computational resources across the different architectures must also be assessed. Since the 4G telephony network shows stable results and good latency performance, it is the network used to send alarms to the Final User in the remaining experiments. In addition, as we will see in this section, this latency study should be extended so that we can compare whether latency is reduced with the generation of Local Events rather than Global Events. Equation 1 has been used to calculate total latency (see “Latency analysis” section). The CEP engine is also implemented at both levels of the proposed architecture.
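Equation 1 itself is not reproduced in this excerpt, but a total end-to-end latency of this kind is typically a sum of per-hop delays. The sketch below, with invented stage names and millisecond values, shows why Local Events (detected at the fog node) should beat Global Events (detected in the cloud):

```python
# Hedged sketch of a summed end-to-end latency model. The stage names
# and millisecond values are illustrative assumptions, not measurements
# or the article's Equation 1.
def total_latency_ms(stages: dict) -> float:
    """Total latency as the sum of per-stage delays."""
    return sum(stages.values())

local_event = {   # detected in the fog node, alarm sent over 4G
    "sensor_to_fog_broker": 5.0,
    "cep_processing": 2.0,
    "alarm_over_4g": 45.0,
}
global_event = {  # raw data forwarded to the cloud before detection
    "sensor_to_fog_broker": 5.0,
    "fog_to_cloud": 60.0,
    "cep_processing": 2.0,
    "alarm_over_4g": 45.0,
}
print(total_latency_ms(local_event), total_latency_ms(global_event))
```

The Local Event path omits the fog-to-cloud hop entirely, which is the structural reason fog detection wins regardless of the exact numbers plugged in.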

Fog computing

Once the data is processed, the output is transmitted back to the endpoint. While edge computing brings computation closer to the source of data, cloud computing makes advanced technology available over the internet for a fixed, recurring fee. Cloud computing uses a network of remote servers instead of a local server or personal computer to store, manage, and process data. It has many benefits: not only does it allow companies to outsource their storage capability, freeing up physical space at their offices, but it is also more secure than storing data locally.

What are the disadvantages of Fog Computing?

Taking into account this evaluation set out in the literature, the actual load of this architecture has been evaluated in our work, but specifically in real-time IoT applications. For these types of IoT applications, two important and critical architecture components emerge, to be integrated into both the edge nodes and the cloud: the CEP technology and the MQTT protocol. These applications also require the processing of large volumes of real-time data to allow for efficient management. The sensors and other edge devices used in these applications are numerous and greatly dispersed. Therefore, fog computing is used to process data concurrently without compromising response time. With fog, data is processed and stored at the edge of the network, closer to the source of information, which is important for real-time control.
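MQTT's role here is publish/subscribe decoupling: sensors publish to topics, and fog nodes or the cloud subscribe to the slices they care about. As a self-contained sketch, the toy broker below supports only MQTT's single-level `+` wildcard; a real deployment would use an actual broker such as Mosquitto, and the topic names are illustrative assumptions:

```python
# Toy illustration of MQTT-style topic-based publish/subscribe.
# Only the single-level '+' wildcard is supported; this is a sketch
# of the pattern, not an MQTT implementation.
class ToyBroker:
    def __init__(self):
        self.subscriptions = []  # list of (topic_filter, callback)

    def subscribe(self, topic_filter, callback):
        self.subscriptions.append((topic_filter, callback))

    def publish(self, topic, payload):
        # Deliver to every subscriber whose filter matches the topic.
        for topic_filter, callback in self.subscriptions:
            if self._matches(topic_filter, topic):
                callback(topic, payload)

    @staticmethod
    def _matches(topic_filter, topic):
        f_parts, t_parts = topic_filter.split("/"), topic.split("/")
        return len(f_parts) == len(t_parts) and all(
            f == "+" or f == t for f, t in zip(f_parts, t_parts))

broker = ToyBroker()
received = []
broker.subscribe("plant1/+/temperature", lambda t, p: received.append((t, p)))
broker.publish("plant1/line3/temperature", 72.5)   # matches the filter
broker.publish("plant1/line3/pressure", 1.2)       # does not match
print(received)  # [('plant1/line3/temperature', 72.5)]
```

Because publishers and subscribers only share topic names, sensors need no knowledge of which fog node or cloud service consumes their data, which is what makes the protocol a good fit for dispersed edge devices.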

Fog Computing: principles, architectures, and applications

Fog computing takes advantage of new technologies like the Internet of Things and allows data from connected devices like sensors or smart home appliances to be processed locally instead of being transmitted back to central servers. For example, on the data plane, fog computing enables computing services to reside at the edge of the network as opposed to servers in a data center. The fog computing paradigm can be simply defined as a natural extension of the cloud computing paradigm. In the literature, there exist related terms, such as edge computing or mist computing. There are no standard criteria for the layered architecture of fog computing, and different approaches exist.

Fog computing works similarly to cloud computing to meet the growing demand for IoT solutions. In this chapter, we introduced a reference architecture for IoT and discussed ongoing efforts in academia and industry to enable the fog computing vision. Many challenges still remain, with issues ranging from security to resource and energy-usage minimization. Open protocols and architectures are also topics for future research that would make fog computing more attractive for end users. It can be seen that when cloud computing is used, CPU consumption is at most 1% higher than in fog computing, an insignificant increase.

Edge Clouds – Pushing the Boundary of Mobile Clouds

As such, when considering the pros and cons of cloud vs fog computing, the question of location awareness becomes an important factor to consider. Overall, while both cloud and fog computing have their respective advantages, it is important to carefully consider which model is best suited for your particular needs. In addition to providing fast and easy access to information, cloud computing also allows for real-time collaboration among individuals and organizations.

For this, several tests are carried out, such as static web page loads, applications with dynamic content, video surveillance, and static multimedia loading for video on demand. Some of the conditions varied were the type of access network, the idle-active time of the nodes, the number of downloads per user, etc. Moreover, the authors determine that under most conditions the fog computing platform shows favourable indicators of energy reduction. Hence, the authors conclude that in order to take advantage of the benefits of fog computing, the applications whose execution on this platform consumes energy efficiently across the whole system must be identified. Thus, cloud computing, the model that underpins interconnectivity and execution in IoT, faces new challenges and limits in its expansion. These limits have arisen in recent years from the development of wireless networks, mobile devices, and computing paradigms that have introduced a large number of information and communication-assisted services.

As Statista's projections above indicate, tens of billions of connected devices will produce huge amounts of data that will have to be processed quickly and sustainably. To meet the growing demand for IoT solutions, fog computing comes into action on par with cloud computing. The purpose of this article is to compare fog vs. cloud computing and tell you more about their respective possibilities, as well as their pros and cons. The demand for information is increasing the load on networking channels, and to deal with this, services like fog computing and cloud computing are used to manage and disseminate data quickly to end users.

Data analysis over long periods of time should be deployed on resources placed at the cloud level. Embedded hardware obtains data from on-site IIoT devices and passes it to the fog layer. Pertinent data is then passed to the cloud layer, which is typically in a different geographical location. The cloud layer is thus able to benefit from IIoT devices by receiving their data through the other layers. Organizations often achieve superior results by integrating a cloud platform with on-site fog networks or edge devices.
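The three-layer flow described above can be sketched as a simple pipeline: embedded hardware collects raw readings, the fog layer filters out what is not pertinent, and only the remainder reaches the cloud layer for long-term analysis. The pertinence rule (readings outside a normal band) is an illustrative assumption:

```python
# Hedged sketch of the edge -> fog -> cloud layering. The "normal band"
# used to decide which data is pertinent is an illustrative assumption.
def edge_layer(device_readings):
    """Embedded hardware: collect raw readings from on-site IIoT devices."""
    return list(device_readings)

def fog_layer(raw, low=10.0, high=90.0):
    """Forward only pertinent data (out-of-band readings) to the cloud."""
    return [r for r in raw if not (low <= r <= high)]

def cloud_layer(pertinent):
    """Cloud level: long-term analysis over whatever the fog forwarded."""
    return {"anomalies": len(pertinent)}

raw = edge_layer([50.2, 3.1, 55.0, 97.4, 60.0])
print(cloud_layer(fog_layer(raw)))  # {'anomalies': 2}
```

Of five raw readings, only the two out-of-band values cross the fog boundary, which is the traffic reduction that makes integrating a cloud platform with on-site fog networks worthwhile.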