Using ECG Authentication for Biometrics in Smart Cities

All biometric systems rely on features of their modalities that are, in one way or another, unique to each individual. This work focuses on an automatic attendance system using heart biometrics, since the heart is internal to the body and unique to each person. Heart biometrics includes different authentication modalities such as the ECG, SCG, PCG and so on. The work concentrates on authentication using ECG signals, applying techniques such as the support vector machine (SVM) for authentication and dynamic time warping for signal matching. The algorithms used have shown highly accurate results, and the considerable challenges faced were effectively managed, opening the way to further advancements.

The need is for a system which does not consume much of teachers' time and is not tiresome for them. The idea, therefore, is to make the attendance marking system automatic: it will mark attendance without the intervention of a third person.
Why should we use heart biometrics? Because it is much more robust than existing biometric software, and it also works effectively for physically handicapped people (Bras, S., et al., 2018).
A significant advantage of the HRS (Heart Recognition System) is that it works effectively for physically handicapped persons. It also rules out major disadvantages of current biometric systems, such as the reduced accuracy of face recognition on dark-skinned people and the unreliable results of fingerprint recognition below -10 degrees Celsius.

SCOPE OF THE STUDY
The Automatic Attendance System expands beyond simple record-keeping: it focuses on identifying the real identity of an individual using appropriate algorithms and sensors. The methodology can trace attendance automatically, which is especially valuable in the era of the novel coronavirus, and it works regardless of whether a person is physically handicapped, providing an equal opportunity to every individual. The study can also be extended well beyond attendance, towards ensuring the wellbeing of individuals by recording and analyzing their heart signals.
This methodology can uniquely identify twins, which is not possible with face recognition systems, and it is unaffected by the low temperatures (below -10 degrees Celsius) at which fingerprint recognition starts producing errors; its scope therefore extends to cold countries such as Russia and Greenland. The well-known hacker Jan Krissler was able to defeat fingerprint biometrics with just some photographs and fingerprints of an individual, whereas this methodology ensures high privacy and a low FAR (False Acceptance Rate).

OBJECTIVE OF THE RESEARCH
• Provides an easy means of taking attendance for both faculty and students.
• Reduces manual work through a more reliable system that uses heart recognition.
• A person cannot mark attendance outside the given acceptable range, and friends cannot mark it on his/her behalf, because the heart recognition system uniquely identifies each individual.
• The study also addresses the limitations of existing biometric systems and can serve as a highly secure system.
• It reduces the chances of losing personal information and makes contactless tracing feasible.
• It saves time that can be redirected to student benefit. For example, if a teacher spends 5 minutes taking attendance per lecture, then with 7 lectures per week that is 35 minutes per teacher; for 100 faculty it totals 3,500 minutes, approximately 58 hours, which the above methodology can reduce to about 1 hour.
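The time-saving estimate in the final objective can be reproduced directly; a quick sketch of the arithmetic (figures taken from the example above):

```python
# Figures from the example: 5 minutes per lecture, 7 lectures per week, 100 faculty.
minutes_per_lecture = 5
lectures_per_week = 7
faculty = 100

weekly_minutes_per_teacher = minutes_per_lecture * lectures_per_week  # 35 minutes
total_minutes = weekly_minutes_per_teacher * faculty                  # 3500 minutes
total_hours = total_minutes / 60                                      # ~58.3 hours
```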

TOPIC ORGANIZATION
This research paper first describes the heart recognition system (HRS), why it is used for biometric purposes, and why the signals it generates are unique. The author team then discusses the various fields in which HRS can be applied and why it is better than other biometrics. For the literature survey, the team studied six research papers covering how the data can be collected, how HRS works, which sensors and other instruments are required, and how HRS can be used for biometric authentication in an automatic attendance marking system. The surveyed literature is also summarized in tabular form.
The author team checks biometrics through ECG signals: each student's signal is recorded and verified against the student's previously recorded ECG data. The exact location of the student is also verified, and the attendance of the student is then marked accordingly.
The team will build an automatic attendance marking system in which heart biometrics, specifically the ECG signals produced by the heart, are used to identify a person and mark his/her attendance accordingly.
The team has tried to automate the attendance marking system using heart biometrics and to capture the person's location, to check whether the person marking attendance is actually at the place of work.
Since the research was done on a small data set, it may not be fully accurate, and heart signals may sometimes change due to physiological conditions that have not been considered in this research.
In this paper various ML algorithms have been implemented on an ECG data set and the results of the different algorithms compared.

ROLE OF AUTHORS
Mr. Rohit Rastogi acted as the guide in completing this research paper; he structured the content of the paper and provided the blueprint. Ms. Ishanki worked on how HRS works and its pros and cons. Ms. Aditi worked on the data set, how it maps to the signals, and how the results are fetched. Mr. Pallavit worked on how the biometrics measure the ECG signals, along with the localization property of the sensors.

INTRODUCTION
Biometric authentication has advanced considerably in recent years, and authentication has become an important part of daily life. Traditional methods such as passwords or PINs can easily be spoofed, reducing the reliability of the system. New methods with reduced EER (Equal Error Rate) were proposed, including modalities such as fingerprints, which scan the ridges and valleys on the fingertips for identification. Similarly, face recognition, iris detection and other methods were proposed which gave remarkable results and encouraged the further study of biometrics to deal with issues such as insecurity and scalability.

Heart Beats Recognition: Multi-Facet Applications
The heart is mainly known for its pumping mechanism and the transfer of blood; however, recent studies have shown that it can also be considered for authentication purposes. The idea of using the heart as a biometric modality comes from features such as the 'lub-dub' sound it produces, its shape and size, which differ for every individual, and its heartbeat signals. Research based on biosignals such as the EEG (electroencephalogram), ECG (electrocardiogram), PCG (phonocardiogram), PPG (photoplethysmogram) and EMG (electromyography) has been performed to increase the efficiency of biometrics. Studies have shown that the DNA and RNA proteins responsible for a person's biological information are unique to every individual and are found in organs such as the heart and brain; in the heart, the tissue containing these proteins is called the myocardium. Hence, the electrical signals extracted from these organs are considered unique for every individual (Singh, Y.N. et al., 2012) (as per Fig. 1).

The World Scenario for Using Heart Biometrics in Various Applications
Heart biometrics can be used for high-security applications because its liveness detection enables real-time authentication, and many applications have already been developed. The use of wireless technology in the defence field has proved highly advantageous in real-time war scenarios. Non-contact technologies are in high demand for defence systems, and some have shown remarkable results, such as the 'Microwave Doppler Radar', which detects the mechanical displacements of a person's or victim's heart and is used for extracting its unique features. Other examples include NASA's FINDER technology, which detects the heartbeat of a person hidden behind piles of rocks, wood, walls and debris; it can detect a person buried even under 10 feet. Similarly, ultrasonic pulse detection utilizes the Doppler effect to detect the relative motion of blood flow in the human body, which even helps in sensing the mental condition of a person, quite useful in certain stressful situations (Morese, Frank, 2015) (as per Fig. 2).

AI, ML, and Big Data Usage in Smart City Development and 21st Century Lifestyle
Smart cities are those in which a large amount of data is collected through various sensors; the collected data is then analyzed and used to address real-life problems and to automate various tasks.
The big data that is collected is used in various fields such as transport, security, cost reduction and sustainability. For example, the analyzed data can be used to make places prone to theft and crime more secure, and big data analysis can be used to regulate traffic signals, which also helps sustainable development (Seneviratne S., et al., 2017) (https://www.hindawi.com/journals/isrn/2012/712032/). Analyzing such a large amount of data and making decisions on its basis is, however, a very demanding procedure; here AI and ML come into play. These technologies can easily find the relations between various data sets and analyze them: with machine learning, large amounts of data can be fed into the system and analyzed to solve problems, and with AI the development can also be made sustainable (Chowdhury, M. et al., 2021) (as per Fig. 3).

Various Approaches of Biometric Systems and Betterment of HRS
Biometrics has long been in use for authentication purposes at various places. It is majorly divided into two categories: physiological and behavioral.
Source: (https://www.esds.co.in/blog/how-ai-and-machine-learning-help-build-a-smart-city/)

Figure 2. NASA's FINDER radar
Physiological traits are extracted from human body parts, for example the iris or fingerprints, whereas behavioral traits come from activities that a person performs in a particular way.
Most of these biometrics can be faked, so they are less trustworthy. HRS, by contrast, relies on signals that are unique to every individual and cannot be faked: ECG signals are unique because of the position, size and structure of the heart, and they are measured from the electrical signals the heart generates. These signals can also be captured continuously without any break, so they can be used where continuous recognition is required; moreover, they can only be obtained from a living subject (Lee, W. et al., 2018); (Silva, H. et al., 2013) (as per Fig. 4).

Knowledge Management and Knowledge Extraction in Human Identity Systems
Knowledge Management is basically a methodology by which all the important details regarding an organization are collected, maintained and used for the organization's welfare. Knowledge Management in Human Identity Systems aims for managing the information effectively ensuring the privacy of the candidate.
Knowledge Extraction is a process of creating knowledge from structured and unstructured sources. Structured sources include relational databases and XML, whereas unstructured sources include images, text, documents, etc.
The DIKW model or DIKW pyramid is often used for knowledge management and data value extraction. This model explains how we move from data to information, then to knowledge, and then to wisdom through a series of actions and decisions.
This model also tells us about various ways to extract useful insights and values from all sorts of data like big data, small data, fast data, and many more. Jennifer Rowley has mapped the DIKW model to different types of information management systems.
According to Jennifer Rowley, data corresponds to transaction processing systems, information to information management systems, knowledge to decision support systems, and wisdom to expert systems (Matos, G. et al., 2007) (as per Fig. 5 and 6).

Knowledge Impact of HRS on Social Life and Accuracy Standards
Smart ECG cards are being used for authentication purposes. Apart from this, remote sensing of CM from radar, PPG signals from cameras, and HRV from laser Doppler vibrometry is enabling the monitoring of a suspect's heart condition (Ribeiro Pinto J., et al., 2018).
Different types of cardiac signals are used for psychophysiological lie detection, providing precise heart rate and cardiac output measurements to identify any abnormal changes in cardiac activity or blood flow.

LITERATURE REVIEW
A literature review is a summary of all the scholarly research papers or articles read on the topic. In it we analyze the topic, identify the areas in which we lack, and consider the future possibilities of the topic. It also records the authors of each paper read and where it was published, and we give citations in it too (Hale M.L., et al., 2019).

Source: https://www.i-scoop.eu/wp-content/uploads/2016/07/DIKW-through-the-eyes-of-IoTcompany-AGT-as-mentioned-on-Electronics-360.gif.webp

The geolocation API is used to find the geolocation of a person. This API is a connection between the client and the server; it reports the person's time zone and location. The API does not store any information itself but gathers it through sources such as "IP address, Bluetooth, RFID, etc." As input, the API first requests the client's permission to access the person's location; it then collects the information through those sources and returns it as a "physical address, i.e. city, country, longitude, latitude, etc., IP address and ISP detail, carrier, local data, etc." Places where this API can be used include tracking of IoT devices, cybersecurity, weather reporting, etc. The first method of the API, getCurrentPosition(), is used to get the location of the person; it returns the latitude, longitude and accuracy of the position, and an argument can be passed for the desired amount of accuracy. The second method, watchPosition(), locates the person's position and updates it as soon as the location changes. The third stops tracking the person's position; this is done through clearWatch() (Geolocation API, PubNub).
Ingale, M. and his team researched the ECG and its databases. As the traditional methods in use are very time-consuming and prone to spoofing attacks, many researchers have shifted their attention to the electrocardiogram (ECG), whose uniqueness makes it usable for biometrics. However, the lack of databases has limited research in this field, so the team tried to find a new tool for gathering databases.
After filtering the database, feature extraction is done using two approaches: "algorithms based on handcrafted features" and "algorithms based on non-handcrafted features". The first can be further divided into "fiducial" and "non-fiducial" methods and deals with the preparation of the ECG signals, whereas the "non-handcrafted" approach uses deep learning algorithms. The latter is less practical because it requires a large amount of data to train the model, along with large computational resources.
A "classification category" is then applied: biometrics is considered as either "ECG verification" or "ECG identification". In the first type the computation is done by matching vectors, etc., whereas in the second it is done through deep learning. In "pre-processing", noise is removed from the ECG signal using two methods, the "Kalman filter" and the "infinite impulse response (IIR) filter". All of this is followed by "segmentation", "feature extraction" and "matching" of the data (Ingale, M., et al., 2020).

Debnath, B. and team discussed that the need for biometric authentication arose from the many cyber-security issues, hacks, privacy losses and more. Biometric authentication works on the idea of measuring or scanning a person's unique physiological traits, then mapping the result against the data set and verifying whether the person's identity is real or not (Jain A.K., et al., 2016).
Many biometric authentication methods exist, but the oldest is fingerprint authentication. Optical fingerprint readers are in use nowadays; they work on the principle of changes in the reflection of light when a finger is placed on the reader's surface. Ultrasound fingerprint readers are the latest technology and are widely used: as soon as the candidate places a finger on the surface, ultrasound sensors scan the whole fingerprint.
Some conclusions drawn on fingerprint authentication are that it does not produce satisfactory results when a person has wet or dry fingers, and that it does not give accurate results when the temperature is below minus 10 degrees Celsius. Studies found it to have an EER (Equal Error Rate) of 2%, a FAR (False Accept Rate) of 2% and an FRR (False Rejection Rate) of 2%. Future work on fingerprint recognition is still needed to make it more accurate (Resque P., 2019).
Debnath, B. and team found that face recognition authentication is based on identifying a person from his/her face in an image or video source. It is one of the most popular technologies for authentication purposes. It has two parts: facial metrics and eigenfaces. Facial metrics focuses on finding the positions of the eyes, ears, nose, etc. and the distances between them.
The eigenfaces method categorizes a face by its degree of match against a set of some 100 to 150 eigenfaces, using patterns to find the size of the ears, nose and eyes. These work efficiently despite hairlines, beards, etc.
In the methodology, the human first presents an official ID, whose authenticity is verified by a set of algorithms. The user then takes a selfie, and a 3D face map is used to gather more data about the user. The user can then use face recognition as a security measure according to his needs. The technology still falls short in several areas: it cannot distinguish between people with the same faces, changes in hair or beard hinder proper recognition, the localization algorithms need more optimization, and the system gives less accurate results for people with dark skin. It has an FRR (False Rejection Rate) of 10% (Debnath B., et al., 2009).
One of the biometric modalities in the field of authentication is based on the iris. This image-based method of authentication answers the demand for security with easy login access. The main steps of iris recognition are image acquisition, followed by pre-processing, iris segmentation and iris normalization. The subsystems of the acquisition process include illuminators, lenses, sensors and a control unit. Iris recognition databases are readily available worldwide, mostly captured in the near-infrared spectrum. Captured iris images usually suffer many artifacts, such as noise, illumination reflections, blurring, gazing and occlusion by eyelashes; several methods are recorded in the literature to remove these artifacts, and removing them through pre-processing enhances the recognition accuracy. Other modalities of biometric authentication include ear lobes, which are highly recommended for attendance security and crime investigation (Winston, J.J., et al., 2019).
An electrocardiograph is a setup to record the ECG signals of a person. ECG signals are the electrical impulses generated throughout the body by the heart's muscle contractions during its cardiac cycle. The ECG is also used for the detection of different heart diseases, cardiac abnormalities and so on, and it is measured for diagnostic purposes by attaching electrodes at different points on the skin. The resulting waveform is produced by different specialized cells of the heart: during the cardiac cycle, the depolarization of the atria produces the P wave, ventricular contraction is represented by the QRS complex, and the last phase is represented by the T wave (in some cases a U wave is also found). Together these components form the ECG signal, also known as the P-QRS-T complex, which varies among individuals because of the different shapes and sizes of their hearts, thus ensuring uniqueness. Hence the ECG has been considered for biometric authentication, which involves feature extraction, classification, segmentation, filtering and matching (Roodposhti, P.S. et al., 2022).
Biometric authentication first involves a sensing mechanism, followed by preprocessing. The registered sample, or template, is compared with the one collected. Feature extraction for ECG signals uses either handcrafted or non-handcrafted methods. The handcrafted methods comprise two subcategories, fiducial and non-fiducial features: fiducial features consider the amplitude or time differences of the peaks of complexes such as Q and T, whereas non-fiducial extraction applies statistical analysis to the ECG signal in the time or frequency domain using different algorithms. Non-handcrafted extraction uses deep learning, which provides robustness and is advantageous over the handcrafted methods, in which the noise removal steps must also be considered alongside the optimization tasks, reducing performance. RR-segmentation is used for segmentation, and dynamic time warping is used for matching the registered template with the collected one (Ingale, M. et al., 2020).
Please refer to Table 3 for a comparative summary of the papers.

METHODOLOGY, SETUP AND DESIGN OF EXPERIMENT
Names of the Algorithms Used

Cascaded Digital Filters, Smooth Filters
Kalman filtering is mainly preferred for preprocessing low-frequency noise, and IIR filtering for high-frequency noise. A cascaded digital filter configuration is used because it removes the three major types of noise: EMG noise, baseline drift and power-line interference. Smooth filtering is then applied to the denoised signal to capture the important traits of the signal.
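A minimal pure-Python sketch of the smoothing and baseline-drift-removal steps described above (the window lengths are illustrative assumptions; a production system would use properly designed Kalman, IIR and cascaded filters):

```python
def moving_average(signal, window=5):
    """Smooth filter: replace each sample by the mean of its neighbourhood."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def remove_baseline_drift(signal, window=51):
    """Crude high-pass: subtract a long moving average (the drifting baseline)."""
    baseline = moving_average(signal, window)
    return [s - b for s, b in zip(signal, baseline)]
```

A short smoothing window suppresses high-frequency noise spikes, while subtracting a much longer moving average removes the slow baseline drift without disturbing the beat shapes.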

RR-Segmentation, Time Domain Analysis
This stage handles feature extraction. Its first phase is the detection of the P-QRS-T complexes, done by detecting the R-peak, which is then taken as a reference for detecting the other waveforms using time domain analysis. This approach is used because it extracts the whole waveform for segmentation, rather than the partial detection done in fixed-length segmentation, and it results in an optimal average EER (Equal Error Rate) of 2%.
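The R-peak-first detection described above can be sketched as follows (the threshold and refractory values are illustrative assumptions, and a real detector would operate on an already-filtered signal):

```python
def detect_r_peaks(signal, threshold=0.5, refractory=40):
    """Find R-peaks: local maxima above `threshold`, at least `refractory`
    samples apart (a crude stand-in for the heart's refractory period)."""
    peaks = []
    for i in range(1, len(signal) - 1):
        if (signal[i] >= threshold
                and signal[i] >= signal[i - 1]
                and signal[i] > signal[i + 1]):
            if not peaks or i - peaks[-1] >= refractory:
                peaks.append(i)
    return peaks

def rr_segments(signal, peaks):
    """RR-segmentation: slice the signal between consecutive R-peaks."""
    return [signal[a:b] for a, b in zip(peaks, peaks[1:])]
```

Each RR segment then contains one full heartbeat waveform, from which the remaining P, QRS and T features can be located by time domain analysis relative to the R-peak.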

Dynamic Time Warping
This method is preferred because it has been used by many researchers and, when tested over 100 subjects, resulted in a higher accuracy (99.4%) than other algorithms such as the Euclidean distance algorithm.
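Dynamic time warping itself is a short dynamic program; a minimal sketch:

```python
def dtw_distance(a, b):
    """Dynamic time warping distance between two 1-D sequences.
    Classic O(len(a) * len(b)) dynamic program over alignment cost."""
    INF = float("inf")
    n, m = len(a), len(b)
    # cost[i][j] = best alignment cost of a[:i] against b[:j]
    cost = [[INF] * (m + 1) for _ in range(n + 1)]
    cost[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i][j] = d + min(cost[i - 1][j],      # advance in a only
                                 cost[i][j - 1],      # advance in b only
                                 cost[i - 1][j - 1])  # advance in both
    return cost[n][m]
```

Unlike a point-by-point Euclidean comparison, DTW can align a stretched or shifted heartbeat with its template at low cost, which is why it tolerates small heart-rate variations between enrollment and authentication.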

Open Database
This type of database is publicly available to all users; it can be used and also modified.

Restricted Database
In this type, users can access the database only with certain restrictions.

Credentialed Database
These databases specifically contain authentication information, called credentials, which are used to connect to resources outside our own servers.

ECG Biometric Authentication: A Comparative Analysis
These papers give in-depth knowledge of why the ECG can be used as a biometric. As it is a relatively new technology, ways of collecting a database suitable for recognition are proposed. The algorithms used are "algorithms based on handcrafted features" and "algorithms based on non-handcrafted features". The authors propose a method for building an ECG database, though it is not fully accurate due to noise, and they have also extended the databases available until now, so that more research can be done in this area. The databases will be used 'off the person' and will be created using MySQL.

Hardware Requirement
• ECG sensor/electrodes: the electrodes used for collecting ECG signals are composed of Ag/AgCl and surrounded by conducting gel.
• Wearable or handy device: wearable or handy devices are used so that the person marking attendance can do so from anywhere.

Software Requirement
MP36 systems are used for data acquisition at a sampling frequency of 1000 Hz. The BIOPAC MP36R is a four-channel data acquisition system useful for collecting data related to life science research; it handles complete data acquisition and analysis and is used with the AcqKnowledge software in combination with electrodes, transducers and other system components.
High-pass and low-pass filters are used for noise filtration, passing frequencies between 1 Hz and 35 Hz.

Network Requirement
In biometrics we require a TCP/IP network, where TCP stands for Transmission Control Protocol and IP for Internet Protocol. It consists of four layers. The first is the link layer, which serves as the communication medium over which data packets are transferred from one place to another. The second layer is IP, which is required for communication across networks. The third layer is TCP, which is required for the communication between the device containing the ECG sensors and the servers. The last is the application layer, where the processing of the data is carried out.
The research project will work in a distributed client-server environment.
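As a minimal sketch of this client-server exchange (the host, the use of JSON, and the message fields are assumptions for illustration, not part of the paper's design), a sensor device could push one ECG window to a server over TCP like this:

```python
import json
import socket
import threading

def start_server(host="127.0.0.1"):
    """Toy attendance server: accepts one connection, acknowledges the ECG payload."""
    srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    srv.bind((host, 0))              # port 0: let the OS pick a free port
    srv.listen(1)
    port = srv.getsockname()[1]

    def handle():
        conn, _ = srv.accept()
        payload = json.loads(conn.recv(65536).decode())
        reply = {"status": "received", "samples": len(payload["ecg"])}
        conn.sendall(json.dumps(reply).encode())
        conn.close()
        srv.close()

    threading.Thread(target=handle, daemon=True).start()
    return port

def send_ecg(port, subject_id, ecg_samples):
    """Toy sensor client: send one ECG window and wait for the acknowledgement."""
    with socket.create_connection(("127.0.0.1", port)) as conn:
        conn.sendall(json.dumps({"subject": subject_id, "ecg": ecg_samples}).encode())
        return json.loads(conn.recv(4096).decode())
```

In the full system, the server side would sit behind the application layer described above, passing the received window on to the preprocessing and matching stages.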

OS Requirement
Windows is used as the operating system. Processor used: Intel(R) Core(TM) i3-6006U CPU @ 2.00GHz. RAM used: 8GB. System type: 64-bit operating system, x64-based processor.

Database Requirement
• ECG-ID Database: contains 310 ECG recordings obtained from 90 persons. Each recording contains ECG lead I, recorded for 20 seconds, digitized at 500 Hz with 12-bit resolution over a nominal ±10 mV range, plus 10 annotated beats (unaudited R- and T-wave peak annotations from an automated detector).
• MIT-BIH Database: the MIT-BIH arrhythmia database is a publicly available dataset which provides standard investigation material for the detection of heart arrhythmia. Since 1980 it has been used for fundamental research and medical device development on cardiac rhythm and related diseases.
• PTB Database: the ECGs in this collection were obtained using a non-commercial PTB prototype recorder.
• CYBHi Database: the Check Your Biosignals Here initiative (CYBHi) was developed as a way of creating a dataset and a consistently repeatable acquisition framework to further extend research in electrocardiographic (ECG) biometrics.
The database used in this project is measured using the BIOPAC MP36 data acquisition system (URL: https://physionet.org/content/cebsdb/1.0.0/). Memory size: 2.7 GB. The purpose of the database was twofold: 1. to check whether slight errors in the detection of the RR time series, when measuring it using two different leads, are influenced by breathing; and 2. to compare the RR time series obtained from the ECG with its surrogate measure obtained from the seismocardiogram (SCG), and to optimize beat detectors for the SCG.
To construct the database, 20 presumed-healthy volunteers were measured; information about the subjects is summarized in the info.txt file. During the measurement, the subjects were asked to lie very still and awake, in a supine position on a comfortable conventional single bed. After attachment of the sensors, the basal state of the subjects was recorded for 5 minutes (records b001 to b020). The subjects then listened to classical music for approximately 50 minutes (records m001 to m020), and finally all subjects were monitored for 5 more minutes after the music ended (records p001 to p020).
Data was acquired using a Biopac MP36 data acquisition system (Santa Barbara, CA, USA). Channels 1 and 2 were devoted to conventional ECG (leads I and II respectively) with a bandwidth of 0.05 Hz to 150 Hz; channel 3 measured the respiratory signal obtained from a thoracic piezoresistive band (SS5LB sensor by Biopac, Santa Barbara, CA, USA) with a bandwidth of 0.05 Hz to 10 Hz; and channel 4 acquired the SCG using a triaxial accelerometer (LIS344ALH, ST Microelectronics) with a bandwidth of 0.5 Hz to 100 Hz. For the ECG measurement, monitoring electrodes with foam tape and sticky gel (3M Red Dot 2560) were used. Each channel was sampled at 5 kHz.
Metadata describes the other data items and how they are associated with each other.

Storage Requirement
The ECG record of a person can be stored on his/her own device, whereas the attendance record can be stored on the cloud or on file servers.

Front End
• HTML, CSS, JavaScript
• Sensors will be used to record ECG signals.

Back End
• It will consist of Node.js, MongoDB as the database, and Express.js.
• Machine learning algorithms and NLP algorithms will be used for the evaluation process.
• The core programming of the sensor verification will be done in the back end.

Steps of Execution
1. Overall Working
• The ECG biometric system operates in two major phases: the enrollment phase and the authentication phase.
• In the enrollment phase, the user's ECG signal is registered via sensors to generate the template.
• In the authentication phase, raw input is collected from the user and, after the processes discussed below, compared with the registered template.
• The authentication phase involves filtering the raw signal using Kalman and IIR filters.
• Filtering is followed by segmentation, which finds the repeating patterns in the ECG signal known as the P, QRS and T complexes; it is done using RR-segmentation or fixed-length segmentation algorithms.
• Feature extraction uses fiducial or non-fiducial methods. Fiducial extraction uses the fiducial points (the P, QRS and T waves) to find their relative amplitudes and morphological features for accurate detection.
• Non-fiducial methods, on the other hand, study other statistical features through time or frequency analysis.
• Matching of the sample against the stored template is done using Euclidean distance or dynamic time warping.
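The enrollment/authentication flow above can be sketched end to end; here the matching step uses the Euclidean distance option, and the feature vectors, subject IDs and acceptance threshold are illustrative assumptions:

```python
import math

def euclidean(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

class ECGAuthenticator:
    """Toy two-phase flow: enroll() stores a template; authenticate() compares
    a fresh (already filtered and segmented) feature vector against it."""
    def __init__(self, threshold=0.5):
        self.templates = {}          # subject id -> enrolled feature vector
        self.threshold = threshold   # maximum accepted distance (assumed value)

    def enroll(self, subject_id, features):
        self.templates[subject_id] = list(features)

    def authenticate(self, subject_id, features):
        template = self.templates.get(subject_id)
        if template is None:
            return False
        return euclidean(template, features) <= self.threshold

auth = ECGAuthenticator(threshold=0.5)
auth.enroll("S01", [0.9, 0.1, 0.4, 0.2])                    # enrollment phase
genuine = auth.authenticate("S01", [0.85, 0.1, 0.45, 0.2])  # small deviation
impostor = auth.authenticate("S01", [0.2, 0.9, 0.1, 0.8])   # different heart
```

A genuine attempt with a small deviation from the template falls within the threshold, while an impostor's feature vector is rejected; in the real system the distance function could be swapped for dynamic time warping.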

2. Database Working
• At the time of enrollment, raw ECG signals will be stored in the database.
• From here, three types of data will flow into three other databases: one will contain data regarding low-frequency signals, another will contain information regarding high-frequency signals, and the third will hold the filtered ECG.
• At the time of feature extraction, a database containing the subjects' names and their extracted features will be used.

3. Sensor Working
ECG sensors, or "mobile-based ECG monitoring devices", are used to measure the electrical activity of the heart. They detect the impulses created by the heart through the electrodes present on the wearable device.
These sensors are used to record the ECG signal of the person wearing the device. The signals will be further processed for the biometric purpose.

Flow Chart
The above flow chart demonstrates the processes involved in the execution of this project, along with the algorithms used in each step. The first process is sensing, which takes raw input from the user and is also known as data acquisition. Sensing is followed by preprocessing, which involves filtering or denoising using different filters, such as Kalman filters, IIR filters, cascaded filters, and smoothing filters, for the different kinds of noise. Feature extraction also involves segmentation, which is done by RR-segmentation followed by time-domain analysis. Matching of the signal is done using dynamic time warping (as per Figure 7).
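The preprocessing stage can be illustrated with a short sketch that applies a Butterworth band-pass IIR filter (via SciPy) to a synthetic trace contaminated with baseline drift and powerline hum. The sampling rate, filter order, and cut-off frequencies here are assumptions for demonstration, not values taken from the authors' pipeline:

```python
import numpy as np
from scipy import signal

fs = 500  # assumed sampling rate in Hz

# Synthetic "ECG-like" trace: a 1.2 Hz rhythm plus 0.1 Hz baseline drift
# and 50 Hz powerline hum.
t = np.arange(0, 10, 1 / fs)
clean = np.sin(2 * np.pi * 1.2 * t)
noisy = clean + 0.5 * np.sin(2 * np.pi * 0.1 * t) + 0.2 * np.sin(2 * np.pi * 50 * t)

# 4th-order Butterworth band-pass IIR filter (0.5-40 Hz), applied
# forward-backward with filtfilt so the output stays phase-aligned.
b, a = signal.butter(4, [0.5, 40], btype="bandpass", fs=fs)
denoised = signal.filtfilt(b, a, noisy)

# The filtered trace should be much closer to the clean rhythm than the raw one.
err_raw = np.mean((noisy - clean) ** 2)
err_filt = np.mean((denoised - clean) ** 2)
print(err_filt < err_raw)  # True
```

The band-pass choice mirrors the flow chart's intent: the high-pass edge suppresses baseline drift, while the low-pass edge attenuates powerline and high-frequency noise in a single IIR stage.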

Block / Process diagram
It describes how data flows within the system. The database containing feature details is first matched against the user and the data present in the database; the features are then unlocked and matched, and the result is returned as true or false (as per Figure 8).

ER diagram
The ER diagram describes the entities and their relationships (as per Figure 9). The level-0 DFD gives a first level of introspection, displaying the major components (as per Figure 11), and the level-2 DFD refines those components further (as per Figure 13).

Use Case diagram
First, the admin has to access the dashboard in order to proceed with further processes. A time record of each user log session is maintained simultaneously, and user credentials are kept secure and available. These requirements are used to validate the admin (as per Figure 14).

RESULTS
The author team obtained the following graphs after experimenting on their datasets. They used the ECG-ID dataset for the implementation, applied a Python framework to generate the results, and observed some interesting patterns.
Using the ECG-ID and MIT-BIH databases, the ECG record and its R-peak positions were obtained, as shown in Figure 15. For the ECG-ID database, three samples were taken from each of 90 people. The first record was used for training, while the second record was used for validation. Ten annotated R-peaks are shown in the figure for each person.
As shown in Figure 16, ECG records from the ECG-ID database were preprocessed to remove three types of noise, i.e., baseline drift, powerline noise, and high-frequency noise. Meanwhile, the figure shows that the effect of baseline drift and powerline noise is still visible in the ECG signal. Figure 17 shows the confusion matrix for a single validation beat, i.e., the first beat of eight or six consecutive beats, while Figure 5b,d shows the confusion matrix for multiple validation beats. From these figures it can be seen that 100% accuracy can be obtained for multiple validation beats, with six beats and eight beats for the ECG-ID and MIT-BIH databases, respectively. The optimum number of multiple beats is evaluated in the next experiment. Multiple ECG beats produced better accuracy compared to a single ECG beat.
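R-peak detection, the step that anchors the per-beat analysis above, can be sketched on a synthetic trace. The real experiments use annotated ECG-ID/MIT-BIH records; the sampling rate, amplitude threshold, and refractory distance below are illustrative assumptions:

```python
import numpy as np
from scipy.signal import find_peaks

fs = 250  # assumed sampling rate in Hz

# Synthetic trace: sharp "R-peaks" once per second on a noisy baseline.
rng = np.random.default_rng(0)
t = np.arange(0, 10, 1 / fs)
ecg = 0.05 * rng.standard_normal(t.size)
beat_samples = np.arange(fs // 2, t.size, fs)  # one beat per second
ecg[beat_samples] += 1.0

# R-peaks stand well above the baseline, and successive beats are at least
# 0.4 s apart, so a height threshold plus a refractory distance suffices.
peaks, _ = find_peaks(ecg, height=0.5, distance=int(0.4 * fs))
print(len(peaks))  # 10 detected beats in 10 seconds
```

Once the R-peaks are located, consecutive beats can be windowed around them, which is what allows the multiple-beat validation above to vote across six or eight beats.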

NOVELTIES
• To date, all the research papers related to automatic attendance marking systems that the team has read use either face recognition or fingerprint biometrics.
• In this research, the team has attempted biometric authentication using the ECG signals generated by the heart.
• ECG authentication can also be used by people who are physically handicapped.
• The application used to mark attendance will also have a special feature that tracks the location of the person marking the attendance.
• This project will provide a common platform for both students and teachers to mark attendance.
• A time limit will also be enforced, within which the student has to mark the attendance.
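The location and time-limit features could be combined in a simple acceptance check. The classroom coordinates, 50 m radius, and 10-minute window below are hypothetical parameters, not the authors' design:

```python
import math
from datetime import datetime, timedelta

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two GPS fixes."""
    r = 6371000.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def may_mark_attendance(fix, classroom, now, window_start,
                        max_dist_m=50, window_min=10):
    """Allow marking only inside the room's radius and the time window."""
    close_enough = haversine_m(*fix, *classroom) <= max_dist_m
    in_window = window_start <= now <= window_start + timedelta(minutes=window_min)
    return close_enough and in_window

room = (28.6754, 77.5020)           # hypothetical classroom coordinates
start = datetime(2021, 3, 1, 9, 0)  # lecture start

print(may_mark_attendance(room, room, datetime(2021, 3, 1, 9, 5), start))            # True
print(may_mark_attendance((28.70, 77.50), room, datetime(2021, 3, 1, 9, 5), start))  # False
```

Gating the biometric match on both proximity and time in this way is what rules out proxies marked from outside the classroom or after the window closes.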

RECOMMENDATIONS
This product should be recommended because it overcomes the drawbacks of existing biometric systems, as it can easily authenticate a physically handicapped person. It reduces proxy attendance and the chances of authenticating a fake identity. It also has a low FRR (False Rejection Rate) and FAR (False Acceptance Rate). It can automatically track the attendance of people who are within the acceptance range. Since ECG signals are used, they can also serve diagnostic purposes.

Limitations
• Some algorithms have been used to maintain the length of ECG waves, but the physiological heart rate reflected in the ECG can change under various conditions, and those conditions have not been considered.
• This study has been done on a small dataset, so it needs to be repeated on large datasets to obtain more reliable results.
• There is some risk of loss of personal information if the system is hacked.

Future directions
• One needs to consider larger datasets to obtain more accurate results.
• Deep learning techniques and models can be used for the normalization procedures.
• The position of a person has to be measured more accurately; the results need to be verified even for very small displacements.

CONCLUSION
• A system which can work on real time authentication is implemented.
• In this research paper, the idea of using heart biometrics was adopted because the traditional modalities stated above suffer from many problems.
• Localized algorithms make it easier to track people and automatically mark their attendance, which saves a lot of time.
• This product can also be used beyond the attendance marking system, for example to grant specific people access to highly confidential information.

ACKNOWLEDGEMENTS
Firstly we pay respect to the almighty. We also would like to express our gratitude towards our parents, teachers and friends who have helped us complete this research paper. We offer our sincere thanks to ABES Engineering College for giving us the chance to write this research paper. We also would like to extend our thanks to the management of ABES EC, Director of ABES EC, HoD of CSE Department for their support. We are extremely thankful to our guide Dr. Rohit Rastogi for his continuous support and valuable advice he has given us from time to time. Lastly we would like to thank all the people who have been involved directly or indirectly to publish this research paper.

COMPETING INTERESTS
We declare that we have no significant competing financial, professional, or personal interests that might have influenced the performance or presentation of the work described in this manuscript.

ETHICAL COMMITTEE AND FUNDING
The experiments do not include any invasive human experimentation, so no ethical constraints have been violated. Although the subjects of the study were humans and air quality directly affects them, the study does not violate any health-related measures. The project is not funded by any agency.
To construct the database, 20 presumed healthy volunteers were measured. Information on the subjects is summarized in the info.txt file. During the measurement, the subjects were asked to remain very still and awake, in a supine position on a comfortable conventional single bed. After attachment of the sensors, we recorded the basal state of the subjects by measuring for 5 minutes (records b001 to b020). After that, the subjects listened to classical music for approximately 50 minutes (records m001 to m020). Finally, we monitored all subjects for 5 more minutes after the music ended (records p001 to p020).
Aditi Mittal is an engineering student in AKTU Univ. Presently, she is pursuing Bachelors of Technology in CSE department from ABES Engineering College Ghaziabad, India. She has keen interest in coding and using technology for the betterment of society. Her hobbies are drawing, dancing and reading books. She is a very curious and also ambitious person. Her goal is to serve society so that the world can be a better place to live.
Ishanki Verma is an engineering student in AKTU Univ. Presently, she is pursuing Bachelors of Technology in CSE department from ABES Engineering College Ghaziabad, India. She has keen interest in problem solving and exploring fields such as space, science, spirituality and psychology. Her hobbies are singing and exploring. She likes to dive deep into the things which make her achieve anything she wants to. Her short term goal is to be a passionate software engineer and long term goal is to join ISRO and to make an impact on society.
Pallavit Saxena is an engineering student in AKTU Univ. Presently, he is pursuing B.Tech in CSE from ABES Engineering College Ghaziabad, India, and is currently in his second year. He has keen interest in coding and solving real life problems. His hobbies are playing badminton and studying psychology. He has a young and alluring personality. His short term goal is to get into one of the top tech giants of the IT industry and his long term goal is to be financially independent and make contributions towards society.