COVID-19 Patient Count Prediction Using LSTM

Document Type

Article

Source of Publication

IEEE Transactions on Computational Social Systems

Publication Date

1-1-2021

Abstract

In December 2019, the COVID-19 pandemic broke out in Wuhan, China, and within a few weeks the disease spread to more than 200 countries worldwide. Affected countries took measures to slow the spread, provide the best possible medical care to infected patients, and enforce precautions to control transmission. Because the infection spread exponentially, there arose a need to computationally model infection patterns and estimate patient volumes. Such estimates are key to the actions local governments may take to counter the spread, manage hospital load, and allocate resources. This article uses long short-term memory (LSTM), a particular type of recurrent neural network (RNN) used for classification, prediction, and regression tasks, to predict the volume of COVID-19 patients in Pakistan. We trained the model on Pakistan's COVID-19 data from March 2020 to May 2020 and predicted the percentage of positive patients for June 2020. We then calculated the mean absolute percentage error (MAPE) to assess the model's prediction effectiveness across different numbers of LSTM units, batch sizes, and epochs. The predicted patient counts are also compared with those of another prediction model over the same duration, and the results reveal that the proposed model's predictions are much closer to the actual patient counts.
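The article itself does not include code; the sketch below is only an illustration of the kind of workflow the abstract describes, assuming a Keras/TensorFlow implementation (the authors do not state their framework). A univariate LSTM is trained on a daily percent-positive series and evaluated with MAPE. All names (e.g., make_windows, percent_positive) and hyperparameter values are hypothetical, not taken from the paper.

```python
# Illustrative sketch only: a univariate LSTM forecaster and a MAPE metric,
# assuming a daily "percent positive" series as described in the abstract.
# Framework choice (Keras/TensorFlow) and all hyperparameters are assumptions.
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

def make_windows(series, lookback=7):
    """Turn a 1-D series into (samples, lookback, 1) inputs and next-day targets."""
    X, y = [], []
    for i in range(len(series) - lookback):
        X.append(series[i:i + lookback])
        y.append(series[i + lookback])
    return np.array(X)[..., np.newaxis], np.array(y)

def mape(actual, predicted):
    """Mean absolute percentage error, in percent."""
    actual, predicted = np.asarray(actual), np.asarray(predicted)
    return float(np.mean(np.abs((actual - predicted) / actual)) * 100.0)

# Hypothetical daily percentage of positive tests (stand-in for March-May 2020 data).
percent_positive = np.random.uniform(2.0, 20.0, size=92)
X_train, y_train = make_windows(percent_positive, lookback=7)

model = Sequential([
    LSTM(50, input_shape=(X_train.shape[1], 1)),  # number of LSTM units is assumed
    Dense(1),                                     # next-day percent positive
])
model.compile(optimizer="adam", loss="mse")
model.fit(X_train, y_train, epochs=100, batch_size=16, verbose=0)

# Evaluate on a held-out slice (here, the tail of the same placeholder series).
y_pred = model.predict(X_train[-10:], verbose=0).ravel()
print(f"MAPE on the last 10 windows: {mape(y_train[-10:], y_pred):.2f}%")
```

In the setting the abstract describes, one would instead roll the forecast forward day by day through June 2020 and compare the predicted counts against the observed ones, repeating the MAPE calculation for each combination of LSTM units, batch size, and epochs.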

Publisher

Institute of Electrical and Electronics Engineers (IEEE)

Disciplines

Medicine and Health Sciences

Keywords

COVID-19, Data models, deep learning, Forecasting, long short-term memory (LSTM), pandemics, Predictive models, Recurrent neural networks, risk estimation, short-term prediction, Training, Viruses (medical)

Scopus ID

85101735106

Indexed in Scopus

yes

Open Access

yes

Open Access Type

Bronze: This publication is openly available on the publisher’s website but without an open license
