Browsing by Author "Jayasena, K. P. N."
Now showing 1 - 2 of 2
Item: Docker incorporation is different from other computer system infrastructures: A review
(Department of Industrial Management, Faculty of Science, University of Kelaniya, Sri Lanka, 2021) Kithulwatta, W. M. C. J. T.; Jayasena, K. P. N.; Kumara, B. T. G. S.; Rathnayaka, R. M. K. T.

The computing world is becoming increasingly complex and mature as modern technologies emerge. Virtualization is a long-established concept, and containerization has now arrived as an innovative alternative. Docker is the most prominent and widely adopted container management technology, while other container management and virtualization technologies offer comparable mechanisms to Docker containerization. This study aims to identify how Docker incorporation differs from other computer system infrastructure technologies in terms of architecture, features, and qualities. The study reviewed forty-five existing publications, following a thorough review protocol to deliver a structured review process. The review was organised around four main research questions. Ultimately, Docker architecture and components, Docker features, Docker integration with other computing domains, and Docker in relation to other computing infrastructures were studied. Synthesizing the selected studies yielded key findings that contribute substantially to the field of computer application deployment and infrastructure.

Item: Fog computing and IoT based prediction system for healthcare using deep learning methods
(Faculty of Science, University of Kelaniya, Sri Lanka, 2020) Buddhika, J. G.; Jayasena, K. P. N.

Rapid developments in information technology over the last three decades have seen a wide range of software solutions developed for various sectors of the economy. Cloud computing delivers tools to support businesses over the Internet efficiently and effectively. However, these cloud architectures struggle to provide the scalability, latency, availability, network capacity, stability, and privacy needed by unified computing systems that depend on the Internet of Things (IoT). Fog computing, a modern computational model, therefore provides a low-latency and energy-efficient approach to addressing these cloud computing challenges. The study introduces a new architecture that embeds ensemble deep learning in edge devices and applies it to automated disease analysis. This architecture delivers healthcare with IoT devices as a Fog service and handles heart-patient data captured as user requests. Further, it draws on Cloud resources when the Fog devices are overloaded, optimizing overall performance. The proposed architecture was deployed and tested for power usage, network bandwidth, latency, reliability, and execution time. We implemented this architecture on different edge devices, including a laptop, a Raspberry Pi, a smartphone, and a NodeMCU, which were employed as Fog nodes. The laptop served as the master Fog node, the Raspberry Pi as a worker node, and the NodeMCU connected the sensors that gather patient data. The smartphone ran a simple Android application that communicates with the Fog nodes through a REST API. A router created the local network, with every node connected to it.
Sensor data could then be sent from the NodeMCU to the Android application, which in turn sends a job request to the master node. On receiving the job request, the master node finds a free worker node that is not overloaded and returns that worker's IP address to the mobile application. The mobile application sends the data to the worker node at the returned IP address and receives the predicted result. We developed the ensemble deep learning module and deployed it on each worker node and on the master node. A Cloud data centre produces the prediction when the Fog environment becomes overloaded. We implemented this architecture and evaluated it against a Cloud-only environment. The study found that this Cloud-Fog architecture provides better performance than a simple Cloud architecture.
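The abstract does not include implementation details, so the following is only a minimal sketch of the master node's job-dispatch step it describes, assuming Python with Flask and requests; the worker IP addresses, the /status and /predict endpoints, the Cloud URL, and the load threshold are hypothetical placeholders rather than values from the study.

```python
# Minimal sketch of a master Fog node's job-dispatch REST API, as described above.
# Assumptions (not from the abstract): Flask for the REST layer, a static worker
# registry, and a load check against each worker's hypothetical /status endpoint.
from flask import Flask, jsonify
import requests

app = Flask(__name__)

# Hypothetical registry of worker Fog nodes (e.g. Raspberry Pi devices) on the
# local network created by the router.
WORKERS = ["192.168.1.11", "192.168.1.12"]
CLOUD_ENDPOINT = "https://example-cloud-dc/predict"  # fallback when the Fog is overloaded
LOAD_THRESHOLD = 0.8  # assumed cutoff for treating a worker as overloaded


def is_free(worker_ip: str) -> bool:
    """Ask a worker for its current load; treat unreachable workers as busy."""
    try:
        resp = requests.get(f"http://{worker_ip}:5000/status", timeout=2)
        return resp.json().get("load", 1.0) < LOAD_THRESHOLD
    except requests.RequestException:
        return False


@app.route("/request-worker", methods=["GET"])
def request_worker():
    """Return the address of a free worker node, or the Cloud endpoint if none is free."""
    for ip in WORKERS:
        if is_free(ip):
            return jsonify({"target": f"http://{ip}:5000/predict"})
    return jsonify({"target": CLOUD_ENDPOINT})


if __name__ == "__main__":
    # The master node (the laptop in the study) listens on the local network.
    app.run(host="0.0.0.0", port=5000)
```

In this sketch, the Android application would first call GET /request-worker on the master node and then POST the sensor readings to the returned target URL to obtain the predicted result; when no Fog worker is free, the target points to the Cloud data centre instead, mirroring the overload fallback described in the abstract.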