Editor IJATCA


    Fca, Publication, Department Member
    This paper aims to perform satellite image processing using Machine Learning models and evaluate their prediction scores. This research classifies satellite images into four distinct categories, namely "green area," "desert," "water," and "cloudy," and trains and evaluates a classification model against several criteria, including precision, recall, F1-score, and total accuracy. The model exhibits exceptional accuracy in correctly predicting and identifying positive instances, as seen by the near-perfect scores achieved for precision and recall across most classes. The F1-scores demonstrate a cohesive equilibrium across the various measures, indicating the approach's efficacy. Significantly, the model has a remarkable overall accuracy rate of 99%, emphasizing its proficiency in precise image categorization. The use of macro and weighted averages highlights the resilience and uniformity of its performance, irrespective of variations in class distribution. The findings presented in this study provide evidence supporting the appropriateness of the model for a range of applications, with a particular emphasis on computer vision and machine learning. Evaluation measures like accuracy, recall, and F1-score provide a detailed analysis of the model's capabilities, rendering them essential for assessing classification models.
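As a concrete illustration of the evaluation measures named above, the following is a minimal sketch, with hypothetical labels and predictions rather than the paper's actual data, of how per-class precision, recall, F1-score, overall accuracy, and the macro average are computed for the four land-cover categories:

```python
def classification_report(y_true, y_pred, classes):
    """Per-class precision, recall, F1, plus overall accuracy and macro F1."""
    report = {}
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = (2 * precision * recall / (precision + recall)
              if precision + recall else 0.0)
        report[c] = {"precision": precision, "recall": recall, "f1": f1}
    report["accuracy"] = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    report["macro_f1"] = sum(report[c]["f1"] for c in classes) / len(classes)
    return report

# Hypothetical predictions over the four categories used in the paper
labels = ["green area", "desert", "water", "cloudy"]
y_true = ["green area", "desert", "water", "cloudy", "water", "desert"]
y_pred = ["green area", "desert", "water", "cloudy", "water", "water"]
r = classification_report(y_true, y_pred, labels)
```

In practice a library routine such as scikit-learn's `classification_report` would be used; the sketch only makes the arithmetic behind the reported scores explicit.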
    Smart hospitals, utilizing advanced technologies, seek to transform patient care, make processes more efficient, and better use resources. Artificial Intelligence (AI) is pivotal in transitioning traditional health centers into smart, adaptive environments. This article delves into how AI is used in such hospitals, emphasizing its role in elevating patient care, streamlining operations, and championing a patient-focused model. AI in these settings covers areas like medical imaging, diagnostics, predictive insights, patient interaction, and aiding clinical decisions. For instance, AI tools for diagnosis have shown impressive precision in pinpointing issues quickly through various imaging techniques. Predictive tools help track disease trends, streamline clinical tasks, and predict potential future hospital visits, leading to more tailored patient care. Additionally, AI promotes patient involvement via tools like virtual aides, chatbots, and distant health monitoring, enabling people to have more control over their health. Merging AI with clinical decision-making tools supports medical professionals in making informed decisions, leading to better patient results. However, using AI in this context also brings forth challenges related to data security, potential biases, regulatory adherence, and the necessity for cross-disciplinary teamwork. This article underscores the need to tackle these hurdles for an ethical and accountable application of AI in health environments. To conclude, infusing AI into smart hospitals can significantly reshape healthcare, leading to more personalized, data-informed, and efficient patient care. As AI progresses, its union with human expertise is set to usher in a new intelligent healthcare era, promising better patient experiences, improved results, and ultimately, a healthier global community.
    The primary aim of this research paper was to determine optimal threshold values for vital parameters in different medicinal batches, guaranteeing high standards of quality and safety. To accomplish this objective, the study harnessed the combined potential of state-of-the-art machine learning algorithms, linear regression models, and rigorous statistical techniques, orchestrated through ETL pipelines and Airflow. Linear regression is among the most common algorithms in both statistics and machine learning. By leveraging these advanced data processing methodologies, the research aims to enhance the pharmaceutical industry's ability to assess and maintain the quality and safety of medicines effectively. The ETL process starts with extracting data from Hive, which offers efficient storage and processing capabilities, making it an ideal source for data extraction. The extracted data is then transformed using ML and data analysis techniques. The transformation logic is implemented in Jupyter Notebooks, which provide an interactive environment for developing and executing code, making it easy to apply ML algorithms and data manipulation techniques. After the data has been transformed, it is loaded into PostgreSQL, a powerful and scalable relational database management system with robust data storage and querying capabilities, making it an ideal destination for the transformed data. The loaded data is organized within PostgreSQL tables. The transformed data stored in PostgreSQL can then be used by the final product, which could be a web application, a reporting dashboard, or any other system that requires access to the processed and enriched data. These tools enabled the formation of threshold values for parameters of different medicines with high accuracy through efficient data processing, analysis, and visualization, allowing users to make data-driven decisions and gain insights from the transformed data.
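The thresholding step described above can be sketched with ordinary least-squares regression: fit a trend across batches, then set an upper control limit a few residual standard deviations above the fitted value. The batch numbers and potency readings below are hypothetical, and `upper_threshold` is an illustrative helper, not the paper's actual pipeline code:

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit of y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
         / sum((x - mx) ** 2 for x in xs))
    return a, my - a * mx

def upper_threshold(xs, ys, x_new, k=3.0):
    """Fitted value at x_new plus k residual standard deviations."""
    a, b = fit_line(xs, ys)
    residuals = [y - (a * x + b) for x, y in zip(xs, ys)]
    sd = (sum(r * r for r in residuals) / (len(residuals) - 2)) ** 0.5
    return a * x_new + b + k * sd

# Hypothetical potency readings (%) across six production batches
batches = [1, 2, 3, 4, 5, 6]
potency = [99.0, 98.8, 98.9, 98.7, 98.6, 98.5]
limit = upper_threshold(batches, potency, x_new=7)
```

In the pipeline itself this computation would run inside a Jupyter-authored Airflow task, with the resulting limits written to PostgreSQL.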
    The rapid pace of technological advancement has driven the emergence of innovative payment platforms catering to consumers' demands for adaptable, user-friendly, cost-effective, and time-efficient transaction solutions. Among these cutting-edge innovations, decentralized digital currencies, commonly known as cryptocurrencies, along with their underlying blockchain technology, have emerged as highly promising disruptors in the financial landscape. This comprehensive review article delves into the intricate effects of cryptocurrencies as a decentralized form of money on contemporary society. Leveraging the inherent accessibility and cost-efficiency of cryptocurrencies, these digital assets empower marginalized populations, facilitating seamless cross-border remittances and expanding financial opportunities. Nonetheless, the volatile nature of cryptocurrency prices and the prevalence of fraudulent activities have sparked concerns among investors and regulators. The lack of robust regulatory oversight and the veil of anonymity shrouding transactions have rendered cryptocurrencies susceptible to scams and money laundering, necessitating unwavering vigilance and innovative strategies to combat these issues. Additionally, the energy-intensive nature of cryptocurrency mining has sparked environmental apprehensions, underscoring the imperative of adopting sustainable practices in the industry. Prudent management of digital assets and the implementation of robust security measures are vital in safeguarding users from potential risks, including fund loss resulting from accidents or hacking incidents. The future of cryptocurrencies promises further advancements, with various countries exploring the concept of Central Bank Digital Currencies (CBDCs) to enhance payment systems and financial services. Scalable solutions, such as layer 2 protocols and sharding techniques, hold the potential to address scalability challenges and optimize transaction speeds. 
Moreover, the convergence of Artificial Intelligence (AI) and cryptocurrencies opens up an exciting frontier where AI models and strategies may unveil novel insights and approaches in cryptocurrency markets. While cryptocurrencies present promising opportunities for financial inclusion and innovation, they also pose challenges and risks that demand careful consideration. By embracing innovation, nurturing financial inclusion, and proactively addressing potential risks, the future of cryptocurrencies is poised for sustained growth and evolution, shaping the way we interact with money and finance in the years to come.
    Today, the integration of Fin-tech with IoT and Artificial Intelligence is rapidly challenging banks. FinTech offers fast support and enhanced convenience, making it highly desirable for customers. This article explores the active and prominent areas of FinTech, including Cryptocurrency and digital cash, Smart contracts, Open banking, Block-chain technology, Reg-tech, Insurtech, Unbanked services, Robo-advisors, and Crowdfunding. The paper presents a cohesive research analysis based on a critical evaluation of the literature. It also provides a comprehensive review of the history of FinTech and its various domains. Technologies such as Machine Learning, AI, and predictive analytics have a direct impact on overall business policies, revenue generation, and resource optimization. To summarize, FinTech is rapidly merging with IoT and AI, posing a significant challenge to traditional banks. The key features of FinTech, including fast support and improved convenience, are highly attractive to customers. The areas classified under FinTech, such as Cryptocurrency, Smart contracts, Open banking, and more, are explored in this article. Additionally, the paper offers a critical assessment of the literature, presenting research themes and a historical overview of FinTech. The integration of technologies like Machine Learning and AI in financial services has profound implications for business strategies, revenue, and resource management.
    Sentiment analysis, a subfield of natural language processing (NLP), involves the automated identification and classification of sentiment or opinion expressed in text. Traditionally, sentiment analysis has focused on English language texts, but with the increasing availability of multilingual data on social media, online reviews, and news articles, there is a growing demand for sentiment analysis in multiple languages. Analyzing sentiment in multiple languages presents unique challenges due to linguistic differences, cultural nuances, and the availability of labeled data. This paper provides an analysis of features based machine learning approaches used for sentiment analysis in multiple languages. It discusses the challenges and considerations specific to multilingual sentiment analysis and provides insights into the performance and effectiveness of different machine learning models. The goal is to explore the performance, effectiveness, and generalization capabilities of different machine learning models across diverse linguistic contexts.
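A feature-based approach of the kind analyzed here can be illustrated with a minimal multinomial Naive Bayes classifier over bag-of-words features. The tiny bilingual corpus below is invented for illustration; real multilingual sentiment systems need substantially larger labeled datasets per language:

```python
import math
from collections import Counter

class NaiveBayesSentiment:
    """Minimal multinomial Naive Bayes over bag-of-words features."""

    def fit(self, texts, labels):
        self.classes = set(labels)
        self.word_counts = {c: Counter() for c in self.classes}
        self.class_counts = Counter(labels)
        self.vocab = set()
        for text, label in zip(texts, labels):
            for word in text.lower().split():
                self.word_counts[label][word] += 1
                self.vocab.add(word)
        return self

    def predict(self, text):
        best, best_lp = None, -math.inf
        for c in self.classes:
            lp = math.log(self.class_counts[c] / sum(self.class_counts.values()))
            total = sum(self.word_counts[c].values())
            for word in text.lower().split():
                # Laplace smoothing keeps unseen words from zeroing the score
                lp += math.log((self.word_counts[c][word] + 1)
                               / (total + len(self.vocab)))
            if lp > best_lp:
                best, best_lp = c, lp
        return best

# Toy bilingual corpus (English and Spanish), purely for illustration
texts = ["great product love it", "terrible awful waste",
         "producto excelente me encanta", "horrible muy malo"]
labels = ["pos", "neg", "pos", "neg"]
clf = NaiveBayesSentiment().fit(texts, labels)
```

Whitespace tokenization is itself a language-dependent assumption; languages without word delimiters would need a different feature extractor, which is one of the challenges the paper discusses.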
    Interoperability stands as a significant challenge for the healthcare industry in the digital era. Electronic health records (EHRs) encompassing vital patient information such as medical history, lab tests, demographics, medications, allergies, immunization records, radiology images, and vital signs are confined within isolated databases, incompatible systems, and proprietary software. This predicament poses substantial barriers to data exchange, analysis, and interpretation. Recognizing the need for improved access, analysis, and communication between healthcare systems, medical devices, and applications at both local and cross-organizational levels, the healthcare sector seeks a solution. Application Programming Interface (API) integration has emerged as the preferred method to facilitate data flow between internal applications, EHRs, and other data exchange tools within the healthcare industry. APIs enable a secure and seamless exchange of data and functionalities, making them a vital component in managing the data exchange process. This paper emphasizes the indispensability of interoperability for future medical advancements. APIs are the most useful tool among many developing technologies, such as IoT, SaaS, and cloud computing, for maximizing performance, boosting revenue, and bettering consumer comprehension. The paper highlights the advantages of APIs and proposes an API-led integration framework to enhance the interoperability of patient health information among healthcare organizations while ensuring data privacy and security.
    Heart failure is a serious cardiovascular condition that affects millions of people worldwide and poses a significant burden on healthcare systems. Early detection and prediction of heart failure can significantly improve patient outcomes by enabling timely intervention and management. In recent years, machine learning techniques have emerged as powerful tools for developing predictive models in healthcare. This abstract presents a heart failure prediction system that utilizes machine learning algorithms to identify individuals at risk of developing heart failure. The system incorporates various features such as demographic information, medical history, vital signs, and laboratory test results to build a predictive model. Data preprocessing techniques are applied to handle missing values, normalize the data, and address data imbalances. The selected machine learning algorithm undergoes training and validation using a large dataset of heart failure cases. The model's performance is evaluated based on accuracy, sensitivity, specificity, and area under the ROC curve. The system's user-friendly interface allows healthcare professionals to input patient data, view the prediction results, and make informed decisions regarding patient care. The implementation of the heart failure prediction system involves the use of modern tools and technologies such as Scikit-Learn, TensorFlow, and Keras for algorithm selection and model development. Data storage and retrieval are handled using a relational database management system such as MySQL. Privacy and ethical considerations are addressed through robust data protection measures and compliance with relevant regulations. The evaluation and results analysis demonstrate the system's effectiveness in predicting heart failure cases with high accuracy and sensitivity. A comparison with existing prediction systems highlights the system's competitive performance and its potential to enhance early detection and intervention. 
In conclusion, the heart failure prediction system presented in this abstract offers a valuable tool for healthcare professionals in identifying individuals at risk of heart failure. The system's implementation, evaluation, and comparison with existing approaches contribute to the growing body of knowledge in the field. Future work could focus on enhancing the system's interpretability, generalizability, and integration with real-time monitoring devices for continuous heart failure risk assessment.
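While the system described above uses Scikit-Learn, TensorFlow, and Keras, the core predictive step can be sketched with a self-contained logistic regression trained by batch gradient descent. The feature names and toy values below are hypothetical placeholders, not clinical data:

```python
import math

def train_logistic(X, y, lr=0.1, epochs=2000):
    """Logistic regression fitted with batch gradient descent."""
    w, b, n = [0.0] * len(X[0]), 0.0, len(X)
    for _ in range(epochs):
        grad_w, grad_b = [0.0] * len(w), 0.0
        for xi, yi in zip(X, y):
            z = sum(wj * xj for wj, xj in zip(w, xi)) + b
            err = 1.0 / (1.0 + math.exp(-z)) - yi  # sigmoid(z) - label
            for j, xj in enumerate(xi):
                grad_w[j] += err * xj
            grad_b += err
        w = [wj - lr * gj / n for wj, gj in zip(w, grad_w)]
        b -= lr * grad_b / n
    return w, b

def predict_risk(w, b, x):
    """Predicted probability of a heart-failure event."""
    z = sum(wj * xj for wj, xj in zip(w, x)) + b
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical normalized features: [age, ejection_fraction, serum_creatinine]
X = [[0.9, 0.2, 0.8], [0.8, 0.3, 0.7], [0.3, 0.8, 0.2], [0.2, 0.9, 0.1]]
y = [1, 1, 0, 0]  # 1 = heart failure event
w, b = train_logistic(X, y)
```

A production system would instead rely on a tested library implementation plus the preprocessing, validation, and ROC-based evaluation the abstract describes.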
    Blood vessels are important biomarkers in skin lesions, both diagnostically and clinically. Detection and quantification of cutaneous blood vessels provide critical information towards lesion diagnosis and assessment. In this paper, a novel framework for detection and segmentation of cutaneous vasculature from dermoscopy images is presented, and the extracted vascular features are further explored for skin cancer classification. Given a dermoscopy image, we segment the vascular structures of the lesion by first decomposing the image, using independent component analysis, into melanin and hemoglobin components. This eliminates the effect of pigmentation on the visibility of blood vessels. Using k-means clustering, the hemoglobin component is then clustered into normal, pigmented and erythema regions. Shape filters are then applied to the erythema cluster at different scales, and a vessel mask is generated as a result of global thresholding. The segmentation sensitivity and specificity of 90% and...
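The clustering stage of the framework can be sketched in miniature: after decomposition, the scalar hemoglobin intensities are grouped with k-means into three clusters, read off in ascending order as normal, pigmented, and erythema. This is an illustrative 1-D re-implementation with invented intensity values, not the authors' code:

```python
def kmeans_1d(values, k=3, iters=50):
    """Lloyd's k-means on scalar intensity values."""
    srt = sorted(values)
    # Spread the initial centers across the sorted range
    centers = [srt[i * (len(srt) - 1) // (k - 1)] for i in range(k)]
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest center
            idx = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[idx].append(v)
        # Recompute each center as its cluster mean
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Invented hemoglobin intensities with three apparent modes
hemoglobin = [0.1, 0.12, 0.11, 0.5, 0.52, 0.48, 0.9, 0.88, 0.91]
centers, clusters = kmeans_1d(hemoglobin)
```

In the paper the clustering runs on the full 2-D hemoglobin component image, but the update rule is the same.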
    The main issue in image forensics is to discover whether an image is authentic or forged and, if forged, to locate which regions have been manipulated. The easy accessibility of image manipulation software has proliferated the possibility of image forgery. Detection of splicing forgery is targeted in this paper. The noise component of a color image has been utilized to extract features from the suspected image. As the consistency of noise between the RGB color channels differs between forged and authentic images, it leaves clues of forgery. First-digit features are extracted using Benford's law and provided to an SVM classifier. The Columbia uncompressed image splicing detection evaluation dataset and the CASIA v1.0 dataset are used to test the proposed technique. Our technique outperforms various previous techniques of image splicing detection.
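The first-digit feature extraction based on Benford's law can be sketched as follows; the resulting 9-dimensional histogram would then be fed to the SVM (the sketch stops at the feature vector, and the sample values are invented):

```python
import math
from collections import Counter

def first_digit(v):
    """First significant digit of a nonzero number."""
    v = abs(v)
    while v < 1:
        v *= 10
    while v >= 10:
        v /= 10
    return int(v)

def first_digit_features(values):
    """Normalized first-digit histogram over digits 1-9 (SVM feature vector)."""
    counts = Counter(first_digit(v) for v in values if v != 0)
    total = sum(counts.values())
    return [counts[d] / total for d in range(1, 10)]

def benford_expected():
    """Benford's law: P(d) = log10(1 + 1/d) for d = 1..9."""
    return [math.log10(1 + 1 / d) for d in range(1, 9 + 1)]

# Invented noise-coefficient magnitudes from one color channel
features = first_digit_features([11, 25, 39, 120])
```

The deviation between the observed histogram and `benford_expected()` is what makes these features discriminative for spliced regions.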
    International Journal of Advanced Trends in Computer Applications (IJATCA) extensively covers research work on cutting-edge innovations and adequate promotional methods. The group of researchers and academicians who are part of the International Journal of Advanced Trends in Computer Applications contribute assistance in reviewing manuscripts and enhancing them, providing a guide for composing high-quality research papers through proper evaluations.

    International Journal of Advanced Trends in Computer Applications is a half yearly, peer-reviewed international research e-journal that addresses both applied and theoretical issues. The scope of the journal encompasses research articles, original research reports, reviews, short communications and scientific commentaries in the fields of computer science and engineering and other related areas. The journal addresses the issues for the vertical and horizontal applications in their respective areas.

    The aim of IJATCA is to provide an international forum for the publication and dissemination of original work that contributes to the understanding of the main and related disciplines of engineering, either empirical or theoretical. It also publishes peer-reviewed research and review articles in the rapidly developing field of computer science engineering and technology and provides a venue for high-caliber researchers, PhD students and professionals to submit on-going research and developments. It is an international scientific journal that aims to contribute to constant scientific research and training, so as to promote research in the field of computer science. This journal is an e-journal offering full access to its research and review papers.
    A MANET can be defined as an autonomous system of nodes or mobile stations (also serving as routers) connected by wireless links, the union of which forms a communication network modeled as an arbitrary communication graph. Self-configuration means that any mobile node can join or leave the network at will. It is a decentralized type of network in which mobile nodes can move from one location to another. Due to the random movability of the mobile nodes, route establishment and route maintenance become the major problems in MANET networks. The main spotlight of this research paper is route establishment and route maintenance, which are core properties of a MANET. The EETC protocol is a route establishment and maintenance protocol in which a broken route is recovered on the basis of node connectivity: the node with maximum connectivity is selected as the best node for route recovery. In this research work, the EETC protocol is further improved by adding a buffer-size parameter for route recovery, which also maintains and improves quality of service: better throughput, less energy consumption, higher packet delivery ratio, lower end-to-end delay, less packet loss, and less overhead in the network. Simulation results show that the proposed IEETC protocol performs well compared to the existing EETC protocol in terms of these parameters.
    Mobile ad-hoc networks are flexible and mobile; they use wireless connections to connect to various networks. An ad-hoc network is a collection of wireless mobile hosts forming a network without a central authority. The automated factor means that any mobile node can join or leave the network at any time, which causes many problems, as QoS parameters are affected. EETS is an improved version of the AODV protocol for path recovery in mobile ad-hoc networks. In the EETS protocol, when a mobile node changes its location, link failure occurs in the network. The EETS protocol works on the node-connectivity factor for link recovery: when any node detects a link failure, the node to which the maximum number of nodes is connected is selected as the best node for link recovery. The EETS protocol performs well in terms of certain parameters, but its link recovery does not include quality-of-service parameters. In this research work, an improvement to the EETS protocol is proposed by applying quality-of-service parameters, such as overhead, energy consumption, and delay, for path recovery and better link stability in mobile ad-hoc networks.
    Differential Evolution (DE) is an evolutionary optimization technique that is very simple, fast, and robust at numerical optimization. It has three main advantages: finding the true global minimum regardless of the initial parameter values, fast convergence, and using few control parameters. The main advantage of DE over other methods is its stability. The DE algorithm is a population-based algorithm, like genetic algorithms, using similar operators: crossover, mutation and selection. DE is impressive because its parameters, the crossover ratio (CR) and the mutation factor (F), do not require the fine tuning that is necessary in many other evolutionary algorithms. In the present study, DE has been used to solve two chemical engineering problems from the literature. The comparison is made with some other well-known conventional and non-conventional optimization methods. From the results, it was observed that the convergence speed of DE is significantly better than that of the other techniques. Therefore, the DE algorithm seems to be a promising approach for engineering optimization problems.
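The mutation, crossover, and selection operators described above can be sketched with the classic DE/rand/1/bin scheme, minimizing the sphere function as a stand-in for the chemical engineering objectives (population size, F, and CR are illustrative defaults, not the study's settings):

```python
import random

def differential_evolution(f, bounds, pop_size=30, F=0.8, CR=0.9,
                           gens=200, seed=0):
    """DE/rand/1/bin: mutation, binomial crossover, greedy selection."""
    rng = random.Random(seed)
    dim = len(bounds)
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    scores = [f(x) for x in pop]
    for _ in range(gens):
        for i in range(pop_size):
            # Mutation: three distinct individuals, none equal to i
            a, b, c = rng.sample([j for j in range(pop_size) if j != i], 3)
            mutant = [pop[a][d] + F * (pop[b][d] - pop[c][d])
                      for d in range(dim)]
            # Binomial crossover; jrand guarantees at least one mutant gene
            jrand = rng.randrange(dim)
            trial = [mutant[d] if (rng.random() < CR or d == jrand)
                     else pop[i][d] for d in range(dim)]
            trial = [min(max(t, lo), hi) for t, (lo, hi) in zip(trial, bounds)]
            ft = f(trial)
            if ft <= scores[i]:  # greedy selection
                pop[i], scores[i] = trial, ft
    best = min(range(pop_size), key=lambda i: scores[i])
    return pop[best], scores[best]

sphere = lambda x: sum(v * v for v in x)  # global minimum 0 at the origin
x_best, f_best = differential_evolution(sphere, [(-5, 5)] * 3)
```

On this smooth, unimodal test function DE converges essentially to the global minimum; constrained chemical engineering problems additionally require a constraint-handling scheme.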
    Skin cancer stands out amongst the most prevalent types of carcinoma, particularly among persons of Caucasian descent and pale-skinned persons. Specifically, melanocytic dermal lesions are conjectured to be the most lethal among the three pervasive skin carcinomas and the second most common type amongst young adults 15-29 years old. These apprehensions have impelled the requirement for automated systems for the diagnosis of skin carcinomas within a limited time frame, reducing unnecessary biopsies, increasing the momentum of diagnosis and giving reproducibility of diagnostic outcomes. In this survey paper, a brief overview of automated detection and segmentation of vascular structures of skin lesions is presented.
    Organisations and individuals have embraced new practices such as social distancing and remote working as a result of the COVID-19 virus, which was declared a pandemic by the World Health Organisation. Cyber criminals around the world capitalised on the crisis while the entire world focused on the pandemic, in particular the health and economic threats posed by COVID-19. Humans have always been the weakest link in cybersecurity; they are either carrying out the attack or the target of the attack. Individuals are increasingly the targets of two types of attacks: social engineering, which seeks to circumvent an existing process and exploits an individual's lack of security awareness, and logical engineering, which targets obsolete or vulnerable software in a system or technology. Recent trends, according to cybersecurity statistics, which are also side effects of the global pandemic, reveal a huge increase in hacked and breached data from sources that are increasingly common in the workplace, like mobile and smart devices. Additionally, the remote workforce greatly increased, giving room for cyberattacks. This paper concludes that the pandemic introduced a new variation in cyber-attacks which majorly focused on security risks from remote working/learning, malicious websites, mobile threats, malicious social media messaging and business email compromise, which might lead to the chances of downloading adware, spyware, ransomware and other malicious software. This paper also recommends that proper awareness of the new gimmicks of cybercriminals is essential in extinguishing the chances of a cyber-attack and possible information breach.
    Service-Oriented Architecture (SOA) is a technique which can be employed to unite various services across operating systems, platforms and networks. Several organizations fail to make complete use of SOA, and the reason behind this is an underdeveloped adoption process. The author has conducted an exploratory study to explore the recent concerns and numerous practices related to SOA adoption, along with assessing the various maturity levels used and the role of information technology in SOA adoption. The required information was gathered by conducting a literature survey that explored previously done work on SOA adoption by searching the web and reading journals and papers. The paper focuses on various significant issues related to the adoption of SOA in organizations.
    All over the world, the COVID-19 health crisis has had an impact on the teaching-learning process. Since the closure of schools and the suspension of face-to-face classes, preventive and urgent actions have been put in place to guarantee pedagogical continuity. In Morocco, from September 7, 2020, the Ministry of National Education declared the adoption of distance education as a pedagogical form, with the possibility of covering face-to-face teaching, while allowing parents of students to choose the mode of teaching that suits their children. Faced with the experience of distance learning, the technical problems related to internet access, and the inadequacy of digital tools experienced during the 2019-2020 school year, most parents chose the face-to-face mode. Faced with this situation, and in order to preserve the health security of students, the Ministry implemented a teaching mode that consists of alternating groups of students in each class to attend face-to-face classes. This change in turn gave rise to new evaluation practices. The main objective of this study is to analyze the evaluation practices of school mathematics learning in Morocco during the COVID-19 pandemic, and to identify the different forms taken by formative evaluation in such circumstances. For this purpose, we opted for a qualitative approach based on content analysis of the continuous tests of 10 teachers of mathematics in the qualifying secondary cycle, addressed to their students at the scientific common core level during the first semester of the 2020-2021 school year, as well as interviews conducted with the same teachers in order to explain the obtained results. The results demonstrate a certain deviation from the formative evaluation process, as well as a decline in the evaluation practices observed during COVID-19.
    A distributed denial-of-service (DDoS) attack is one of the most powerful weapons on the internet. Research indicates that several works have been done to mitigate DDoS attacks on Linux-based servers. However, the types of DDoS attacks covered were mostly HTTP GET Flood attacks on ports 80 and 443. Moreover, the IPTables firewall rules used were not automated using Bash scripts to make them portable, and the firewall rules in most cases were written to mitigate attacks coming from a single IP address. This study therefore expands the scope of mitigating DDoS attacks using IPTables to include TCP SYN Flood attacks, UDP Flood attacks and PING (ICMP) Flood attacks. After carrying out the tests once the Bash scripts had been executed, DDoS attacks in the form of TCP SYN Flood, UDP Flood and ICMP (Ping) Flood were generated using HPing3 and were successfully mitigated, as the Linux server dropped the packets that make up these attacks while allowing legitimate traffic and users to access resources on the server.
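Rules for the three flood types might, for example, take the following shape using the `limit` match module. The rate limits are illustrative values, not the study's exact script, and the commands must run as root:

```shell
#!/bin/bash
# Illustrative IPTables mitigation for the three flood types discussed.
# Limits are example values; tune them for the protected server's traffic.

# TCP SYN flood: rate-limit new SYNs, drop the excess
iptables -A INPUT -p tcp --syn -m limit --limit 25/second --limit-burst 50 -j ACCEPT
iptables -A INPUT -p tcp --syn -j DROP

# UDP flood: rate-limit inbound UDP
iptables -A INPUT -p udp -m limit --limit 10/second -j ACCEPT
iptables -A INPUT -p udp -j DROP

# ICMP (ping) flood: allow a trickle of echo requests, drop the rest
iptables -A INPUT -p icmp --icmp-type echo-request -m limit --limit 1/second -j ACCEPT
iptables -A INPUT -p icmp --icmp-type echo-request -j DROP
```

Wrapping these commands in a Bash script is what makes the rule set portable across servers, which is the automation gap the study set out to close.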
    The main aim of image enhancement is to apply some operation on an image, in the spatial or frequency domain, so that the resultant image is more appropriate than the original for a particular application. In this paper, a method to enhance low-contrast images in the spatial domain is proposed. The method first applies a power law to correct the image tone and then applies an improved histogram equalization to reduce the noise amplification problem of plain histogram equalization. The performance is measured both quantitatively and qualitatively, and the results reveal that the method is better than the state of the art and can be applied to many image processing and computer vision applications.
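A minimal sketch of the two-stage idea (plain gamma correction followed by classic histogram equalization via the cumulative distribution; the paper's improved equalization step is not reproduced here):

```python
def enhance(img, gamma=0.6, levels=256):
    """Two-stage contrast enhancement on a 2-D list of ints in [0, levels-1]:
    power-law tone correction, then classic histogram equalization."""
    flat = [p for row in img for p in row]
    # Stage 1: power-law (gamma) tone correction, gamma < 1 brightens shadows.
    toned = [round(((p / (levels - 1)) ** gamma) * (levels - 1)) for p in flat]
    # Stage 2: histogram equalization via the cumulative distribution function.
    hist = [0] * levels
    for p in toned:
        hist[p] += 1
    cdf, total = [], 0
    for h in hist:
        total += h
        cdf.append(total)
    n = cdf[-1]
    lut = [round((c / n) * (levels - 1)) for c in cdf]
    out_flat = [lut[p] for p in toned]
    w = len(img[0])
    return [out_flat[i * w:(i + 1) * w] for i in range(len(img))]
```

Plain equalization like this amplifies noise in flat regions, which is precisely the problem the paper's improved variant targets.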
    Road traffic injuries are one of the highest public health hazards, and in order to bring down the mishaps, one should be well aware of the road safety rules. Most highway users ignore road safety rules, probably because they believe it is perfectly okay to violate them, or they get a feeling of accomplishment from being able to violate them and not get caught. The helmet is the main safety equipment for motorcyclists, but many riders do not use it. In order to enhance enforcement of the road safety rule that every motorcycle user on the highway wear a suitable helmet, an automated system that captures highway users' state and reports defaulters to the appropriate authorities is important. The deep learning method known as transfer learning with Convolutional Neural Networks can be highly efficient in expediting detection of helmets on the highway.
    In this modern era, the widespread use of telephones, cell phones and tape recorders has made them effective tools for the commission of criminal offences, and certain forms of crime often rely on them. Criminally minded individuals might use strategies and tricks to conceal their voice, assuming they would remain undercover and that no one would identify them. Such misuse of voice can be evaluated by using the individualizing character of audio. The manipulation or alteration of an individual's speech is regarded as voice disguise; it can be done intentionally or unintentionally. Luckily, disguise is not so easy, because everyone has their own distinct and unique voice. Examining parameters like pitch, frequency and manner of speaking, with a focus on vowels, can help to identify a voice that someone has disguised by changing it. Pitch shift is a common method of masking introduced by perpetrators. When comparing forensic voices, the large variation in acoustic properties leads to weaker speaker detection performance. In most cases, the perpetrator tries to cover his voice when making an anonymous call. That is why it is necessary, before naming a speaker, to research the possibilities for disguising the voice. This review paper deals with the reported studies that analyse and compare genuine and disguised voices using various techniques and software.
    Nowadays everyone has the ability to capture, store and transfer digital images. In forensics, unlike other physical evidence, an image shows the authenticity of real objects and scenes. Sometimes photographs of criminal activities are taken by a witness, but due to their blurred appearance they are neglected as evidence. These days, many sophisticated image-processing software packages are available, which have not only increased the cases of manipulated or faked photographs but also emerged as problem solvers for such images. Various image-enhancement software improves the quality of a digital image, which can help retrieve useful information from a blurred image. The current paper reviews the literature on image enhancement, or de-blurring, of digital images of both types, i.e. motion blur and out-of-focus blur, using Adobe Photoshop software.
    Path planning has been one of the major research challenges in mobile robot navigation systems. Researchers in this area have recorded significant success; however, research issues remain. The Ant Colony Optimization (ACO) algorithm, an intelligent optimization algorithm, has recorded successes in many domains including robot control. In this work, ACO was used to find solutions to the robot routing problem. Simulation results established that the proposed algorithm outperformed the min-max ant system technique.
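A toy sketch of ACO-style path search on a small weighted graph (the graph, parameters and pheromone-update rule below are illustrative assumptions, not the paper's setup or its min-max comparison):

```python
import random

def aco_shortest_path(graph, start, goal, n_ants=20, n_iters=30,
                      evaporation=0.5, seed=0):
    """Minimal Ant Colony Optimization: ants walk start->goal, edge choice
    is biased by pheromone / edge cost, and shorter tours deposit more."""
    random.seed(seed)
    tau = {(u, v): 1.0 for u in graph for v in graph[u]}  # pheromone levels
    best, best_len = None, float("inf")
    for _ in range(n_iters):
        tours = []
        for _ in range(n_ants):
            node, path, visited = start, [start], {start}
            while node != goal:
                choices = [v for v in graph[node] if v not in visited]
                if not choices:          # dead end: abandon this ant
                    path = None
                    break
                # Probability proportional to pheromone / edge cost.
                w = [tau[(node, v)] / graph[node][v] for v in choices]
                node = random.choices(choices, weights=w)[0]
                path.append(node)
                visited.add(node)
            if path:
                length = sum(graph[a][b] for a, b in zip(path, path[1:]))
                tours.append((path, length))
                if length < best_len:
                    best, best_len = path, length
        # Evaporate everywhere, then deposit inversely to tour length.
        for e in tau:
            tau[e] *= (1 - evaporation)
        for path, length in tours:
            for e in zip(path, path[1:]):
                tau[e] += 1.0 / length
    return best, best_len
```

On a grid map for robot routing the same loop applies, with grid cells as nodes and obstacle cells removed from the graph.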
    Software-defined networking (SDN) has created a new way of building and managing networks, but it has also changed the attack surface presented by the network. SDN offers many designs that permit straightforward mitigation of particular sorts of attacks, for example DoS, and permits further work to alleviate other attacks. However, SDN often introduces new flaws that are absent in conventional networks, for example a lack of communication between the control plane and the data plane. Several new technologies and strategies have been recommended to overcome shortcomings in SDN security, and additional work may also be applied to fix them. Current SDN work explores many statistical trends that contribute to the state of SDN technology implementation. Because OpenFlow, SDN's most common implementation, is currently being used in production environments, a lot of research has been done to use and develop the protocol. There is, however, another research trend whose work is mostly pertinent to SDN in general, including designs that give more flexibility than OpenFlow. The expected study will likely follow these trends by enhancing the OpenFlow protocol and suggesting more general alternatives, and this research will include further development of network-design testing tools and research into OpenFlow enhancements when used in production environments. This work presents a survey of current SDN security research and other work in the field of SDNs that is relevant to security, and a forecast of future SDN security research directions.
    Size-reduction mechanisms for real-life data sets are very important and an essential factor in healthcare-based machine learning (ML) analysis, due to their high-dimensional nature. ML-based feature selection aims at determining a minimal feature subset from a problem domain while retaining a suitably high accuracy in representing the original features. Rough Set (RS) theory provides a mechanism for discovering data dependencies in a data set, and its reducts facilitate the reduction of the number of conditional attributes and the set of associated objects contained in a dataset while preserving the information of the original dataset. The process uses the data alone and does not need any additional information. This paper presents the fundamental concepts of the RS and Tolerance RS (TRS) approaches and adapts the related feature selection for two relevant healthcare applications. Firstly, the TRS-based feature selection method is used in the classification analysis of three medical datasets; secondly, the method is used in chest X-ray image analysis for nCOVID-19 diagnostic classification and in a non-invasive thermal imaging process to detect inflammation and vascular dysfunction for sensitive screening of nCOVID-19 cases.
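The dependency-degree computation at the heart of RS-based feature selection can be sketched in a few lines (a simplified greedy QuickReduct, assuming a crisp equivalence relation rather than the tolerance relation of TRS; the tiny table in the test is made up):

```python
def dependency(table, features, decision):
    """Rough-set dependency degree gamma_P(D): the fraction of objects
    whose feature-value equivalence class is consistent on the decision."""
    groups = {}
    for row in table:
        key = tuple(row[f] for f in features)
        groups.setdefault(key, set()).add(row[decision])
    pos = sum(1 for row in table
              if len(groups[tuple(row[f] for f in features)]) == 1)
    return pos / len(table)

def quickreduct(table, features, decision):
    """Greedily add the feature that most raises the dependency degree
    until it matches that of the full feature set (a reduct candidate)."""
    full = dependency(table, features, decision)
    reduct = []
    while dependency(table, reduct, decision) < full:
        best = max((f for f in features if f not in reduct),
                   key=lambda f: dependency(table, reduct + [f], decision))
        reduct.append(best)
    return reduct
```

Because only value-equality groupings are used, the process indeed needs the data alone and no extra domain information, as the abstract notes.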
    With advancements in the medical field, technology is being applied in ingenious ways that let doctors use it as a beneficial tool. Examining dental radiographs usually costs dentists time and is also error-prone due to their complex structure. The idea is to analyze dental radiographs more easily by applying neural networks and transfer learning techniques. These intelligent techniques assist in producing precise results. The novelty is to apply neural networks to those x-rays and analyze them with the aid of transfer learning models. For this, a radiograph is taken as input for building a model using the weights and models of transfer learning. Various architectural models from transfer learning are applied to train on the x-ray data, yielding accurate results. Among the applied models, the MobileNet architecture with some additional neural network layers gave error-free results. This x-ray analysis segregates radiographs having caries and gives the output as a probability value. The application allows dentists a quick and easier reading of dental x-rays, which saves time.
    Energy prediction of appliances requires identifying and predicting individual appliance energy consumption when combined in a closed-chain environment. This experiment aims to provide insight into reducing energy consumption by identifying trends and the appliances involved. The proposed model tries to formalize such an approach using a time-series-forecasting-based process that considers the correlation between different appliances. The entire work has been conducted in two parts. The first part highlights and identifies the energy consumption trends. The second part focuses on the comparison and analysis of different algorithms. The main objective is to understand which algorithm provides a better result in predicting energy consumption. A comparison of algorithms for appliance usage prediction using identification and direct consumption reading is presented in this paper. The work is presented on real data taken from the REMODECE database, which comprises 19,735 instances with 29 attributes. The data records the energy at 10-minute intervals over about 4.5 months.
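Two of the simplest baselines such an algorithm comparison might start from can be written directly (these are generic illustrative baselines, not the algorithms the paper compares):

```python
import math

def rmse(y_true, y_pred):
    """Root mean squared error between two equal-length sequences."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(y_true, y_pred))
                     / len(y_true))

def naive_forecast(series):
    """Persistence baseline: predict each point as the previous observation."""
    return series[:-1]

def moving_avg_forecast(series, window=3):
    """Predict each point as the mean of the preceding `window` values."""
    return [sum(series[i - window:i]) / window
            for i in range(window, len(series))]
```

On a steadily rising load the persistence baseline already beats the moving average, which is exactly why comparing several algorithms on the real REMODECE series matters before trusting any single one.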
    Games and games technology have been used in recent times to investigate their possible impact in teaching abstract concepts. The underlying hypothesis is that the motivating qualities of games may be harnessed and embedded in a game-based learning system (GLS) to accelerate the comprehension of abstract concepts. In this paper, an AntHill Invader Game (AIG) framework is presented to guide the design of gamified ant social collaborative behaviour in an ant hill colony. A detailed workflow illustration and a mathematical model for the use of AIG in the development of a GLS are presented in this paper.
    In feature extraction methods, the extracted features contain noise such as spikes or ridges, and gaps, which affect the accuracy. This paper aims to remove the connected noise (ridges) and fill the gaps between two points of roads extracted by the Normalized Difference Asphalt Road Index (NDARI). The roads extracted by this methodology have connected noise in the form of spikes. This connected noise can be removed or smoothed by the Subspace Constrained Mean Shift (SCMS) algorithm, which is non-parametric, and by a Circle Window method. Using these methods, road (line) features are smoothed, as shown in the experimental results.
    This paper explores compression of Aadhaar number storage through the concept of reduced bit-level ordering. The Aadhaar number is a unique twelve-digit number used in various government schemes. The population of our country is more than 135 crores, which means more than 135 crores of Aadhaar numbers will be available. The uniqueness of a beneficiary is ensured by avoiding duplicate beneficiaries, and verifying uniqueness requires Aadhaar number comparison in a minimal amount of time. Hence there is a need to reduce Aadhaar storage in order to minimize the search time. The numbers are represented using various number representations, with their required storage. The compression of Aadhaar storage is implemented using the concept of bit-level ordering, which takes sorted integers as input. Inputs at various bit levels are taken, and the output is obtained as index bits and content bits. The resulting space savings are depicted in tables and graphs.
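While the paper's exact index-bit/content-bit scheme is not reproduced here, the underlying idea — sorted integers can be stored in fewer bits because successive gaps are small — can be sketched with a simple delta encoding (the 12-digit numbers in the test are made-up examples):

```python
def delta_encode(sorted_nums):
    """Store the first value plus successive gaps of a sorted integer list."""
    return [sorted_nums[0]] + [b - a for a, b in zip(sorted_nums, sorted_nums[1:])]

def bits_needed(n):
    """Minimum bits to represent a non-negative integer (at least 1)."""
    return max(1, n.bit_length())

def saved_bits(sorted_nums):
    """Bits saved by delta-encoding relative to storing each value raw."""
    raw = sum(bits_needed(n) for n in sorted_nums)
    enc = sum(bits_needed(n) for n in delta_encode(sorted_nums))
    return raw - enc
```

A raw 12-digit number needs about 37 bits, whereas the gap between two neighbouring sorted Aadhaar numbers usually fits in far fewer, which is also what makes fast binary search over the compressed form feasible.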
    The object of the present research paper is to develop an Artificial Neural Network (ANN) simulation using five independent π-terms (aspect ratio, aggregate-cement ratio, water-cement ratio, percentage of fibre and control strength), and a second simulation using three of them (control strength, percentage of fibre and aspect ratio), for the prediction of SFRC strength. The output of the network is evaluated by comparing the experimental strength with the strength predicted by the ANN simulation. The study becomes more fruitful when the most influential π-term for the prediction of SFRC strength is identified. The beauty of both models is that the same model can be used to predict compressive strength, flexural strength and split tensile strength.
    Cloud Computing (CC) is a model that allows shared and configurable computing resources positioned in the cloud with little management effort from the Cloud Service Providers (CSPs). However, security, reliability, cost, virtualization, need, on-demand service, maintenance, integration, user-friendliness, legislation and regulations are top priorities in the adoption of cloud computing concepts, with security being the most important. This paper highlights the challenges of adopting the cloud computing paradigm and examines threats in cloud computing as highlighted by several authors, with proposed solutions. The burdens on Cloud Service Providers and the future of this technological shift are also discussed.
    The recent influx in the deployment of cloud computing can be attributed to large, medium and small enterprises' and individuals' quest to decrease IT cost and overcome economic recession. However, the cloud still faces challenges such as data breaches, data loss, malicious insiders and denial-of-service attacks, all of which point to the security of the cloud. This makes security an important discussion in cloud computing, which led to the objectives of this research: to evaluate the security protocols and measures employed by major cloud service providers, and to offer recommendations to both Cloud Service Providers (CSPs) and users. The research employed comparative techniques to evaluate the security measures and protocols employed by the top three cloud service providers: Microsoft, Amazon and Google. It is worth noting that the compute services of these CSPs are claimed to be secure, as are their storage services. It is recommended that CSPs provide two-factor authentication for users, which ensures that cloud users use very strong passwords or keys. While cloud users should ensure proper cloud logging and authentication, they should also avoid weak and generic passwords, as these make the cloud susceptible to breaches.
    Innovative advancements in the telecommunication sector, in the form of 5G and 6G technologies, have raised issues pertaining to the human health impacts of the electromagnetic waves (EMW) emitted by cell phone base transmitter stations (BTS). The present paper highlights the health issues that might arise from EMW from cell phone BTS. Because of their super-fast speed and the potential for low-latency connectivity even in far-flung areas, 5G and 6G technology will be relied upon extensively for implementation in different industrial fields and in research and development in the near future, and will further satisfy the thorough need for bandwidth through the positioning of a huge number of densely sited base stations working in the millimeter-wave range. The introduction of new emission sources, operating alongside the already existing 2G/3G/4G mobile technologies, raises concerns about exceeding the acceptable EMF exposure limits. Due to these services, cell phone users and individuals living within short proximity of cell phone base stations have become progressively worried about the possible harmful effects on their health of the radiofrequency radiation generated by these devices. The fact that this radiation is hidden and intangible, and enters and leaves our bodies without our knowledge, makes it considerably more alarming.
    The paper covers the several interfaces between the layers of the SDN architecture. Four interfaces are defined: eastbound, westbound, northbound and southbound. Different kinds of methods and techniques, as well as the targeted audiences, are presented with respect to the literature review. The components and their pros and cons are discussed, signifying their noted value. The major challenges encountered are reflected in SDN techniques, and how SDN curbs these challenges, along with its benefits, is discussed thereafter. The various segments of SDN are defined, including its controllers, which take centralized, distributed and multilayer structures. We conclude by abstracting a graph scale through which an audience can easily assess SDN's implementation.
    Obesity in teenagers and adults has increased worldwide, with serious impacts and consequences for health in the short and long term. Technology has allowed the discovery of new ways of treating diseases and health problems, and data mining has become a relevant area of research and discovery, especially in recent years, due to its precision and reliability in analyzing patient datasets to detect diseases and facilitate their prevention. The goal of this study was to identify the data mining techniques and algorithms most commonly used to detect the factors that favor the onset of obesity, and to determine the reliability of those methods based on the results obtained from a data mining model. Data mining methods such as simple regression and decision trees are the most commonly used to detect obesity levels: the simple regression method was found in 19% of the articles reviewed and the decision tree method was used in 11% of them.
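A decision tree reduced to its smallest form — a one-split stump on a single numeric feature — shows why such models are popular for this kind of screening task (the BMI feature, labels and threshold below are hypothetical, not from the reviewed studies):

```python
def stump_fit(xs, ys):
    """Fit a one-split decision stump on one numeric feature: choose the
    threshold t minimizing misclassifications for the rule 'x >= t -> 1'."""
    best = None
    for t in sorted(set(xs)):
        # Predict 1 when x >= t, else 0, and count errors against the labels.
        errs = sum((x >= t) != y for x, y in zip(xs, ys))
        if best is None or errs < best[1]:
            best = (t, errs)
    return best[0]
```

A full decision tree simply recurses this split on each resulting partition, which is what makes its decisions easy to read off as rules, a property clinicians tend to value.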
    About half a decade ago, when the idea of smart glass technology was still promising, popular opinion had it that by the year 2020 this interactive technology would have advanced into a mainstream consumer product, becoming more fashionable, socially acceptable and functional. However, the results from this study show that progress on the smart glasses concept has been rather slow. Smart glasses are still not mainstream consumer products; rather, they are specialized tools mainly adopted by industry for various tasks despite their shortcomings. This paper investigates the field of smart glass technology, providing an overview of existing products and their applications, revealing 5 major areas in which smart glasses cannot easily be accepted as consumer products, and providing a research road map for future work on the usage and acceptance of smart glass technologies as consumer products.
    In today’s world, which is getting digitalized day by day while data increases exponentially, it becomes very important to attend to data security and privacy. As data increases, the threats to that data increase at the same rate. The data that is uploaded to the internet or any other network needs to be safe, for the sake of a company’s private data and of the people who use that network. In this paper the authors discuss the various types of attacks and threats on the network of which a company or an individual needs to be aware in order to protect against any malicious activity. The paper also explores the various security measures for protecting data and providing proper security and privacy.
    Among platforms for socialising and sharing, the one social media platform that has officially stepped into the era of digital marketing is Instagram. But using a social networking application for marketing, rather than an application especially designed for marketing, is a far different approach. When Instagram was launched on 6 October 2010, it was used for sharing photos and videos. In 2016, Instagram officially announced the option to use an account as a professional account, i.e. a business profile. Nowadays Instagram is one of the most popular social networking platforms officially meant for business as well as social networking. With over 5 billion users, Instagram's popularity cannot be neglected when it comes to marketing. The platform is a pool of customers, audiences, communities, talent, art and, most important, memories. In simple words, we can say that Instagram is an innovation in digital marketing. This paper is a brief overview of the most interesting features of Instagram in terms of digital marketing, not only for products but also for talent. It includes the steps to switch from a personal account to a business account and how to use its features in the best way to step up into the business world. Moreover, it also covers the different modes of promotion, marketing techniques and effective terminology for using the application in a productive way.
    Saving excessive energy consumption is becoming a key concern in networking, on account of the probable practical advantages. These concerns, generally referred to as "green networking", relate to embedding energy awareness in the design, in the devices and in the protocols of networks. In this work, the author first formulates a more precise definition of the "green" property. The author furthermore categorizes a few principles that are key enablers of energy-aware networking research, then outlines the up-to-date state of the art and offers a list of the relevant work, with a particular focus on green networking.
    Cloud computing provides on-demand services to its clients, and data storage is among the primary services it provides. A cloud service hosts the data of the data owner on its servers, and users can access their data through these services. As data owners and servers are different identities, the paradigm requires that the data be correctly hosted in the cloud storage server. This paper discusses the various techniques that are used for secure data storage on the cloud. One of the visions of cloud computing is access to traditional supercomputing and high-performance computing power; such workloads are usually run by military and research facilities, which use them to perform tens of trillions of computations per second in applications such as financial portfolios, delivering personalized information, providing users with data storage and powering large immersive computer games.
    India is now home to the largest number of blind people in the world. Of the 37 million blind people worldwide, 15 million are in India, and 75% of these are cases of avoidable blindness. On the other hand, while our nation needs 2.5 million donated eyes every year, the nation's 109 eye banks (5 of them located in Delhi) manage to collect a maximum of only 25,000 eyes, of which 30% cannot be used. In the meantime, the shortage of donated eyes is becoming a big dilemma. Of the 15 million blind people in India, three million, 26% of whom are children, have corneal disorders; however, only 10,000 corneal transplants are performed every year because of the shortage of donated eyes. The target of the bionic eye is to re-establish fundamental visual cues for people with ocular conditions such as retinitis pigmentosa, an inherited disease of the eye. A video camera seated on a pair of lenses acquires and processes the images. These images are sent wirelessly to a bionic implant at the rear of the eye that stimulates the remaining optic nerves to generate points of light (phosphenes) that form the basis of the images in the brain. Therefore, even blind people can have perception. The eye is a complex optical system that, like other organs, can be harmed by infection or injury. Therefore, several prostheses have been designed to allow these people to reclaim their vision and enjoy a full life. This document refers chiefly to the cornea, but also covers the iris. Artificial retinas, being a separate and rapidly progressing area of research, are not covered here.
    The dark web, also called the invisible web or hidden web, consists of portions of the World Wide Web whose content is not indexed by standard web crawlers, for a variety of reasons. Common uses of such hidden content are webmail and internet banking, as well as paid services behind a paywall, for example on-demand video and many others. Everybody who uses the web routinely visits what could be classed as deep web sites without knowing it. Much of this content is hidden behind HTML forms. The surface web is the opposite term to the dark web. A place where whole sections of the web are hidden from the view of ordinary web surfers, and in which the people using them are also hidden from view, is referred to as the dark web. The dark web is the anonymous web, where it is very hard for hackers, spies or government agencies to trace web users and examine which websites they are using and what they are doing there.
    This paper is aimed particularly at readers concerned with major systems employed in medium to large commercial or industrial enterprises. It examines the nature and significance of the various potential attacks and surveys the defense options available. It concludes that IT owners need to think of the threat in more global terms and to give a new focus and priority to their defense. Prompt action can ensure a major improvement in IT resilience at a modest marginal cost, both in terms of finance and in terms of normal IT operation. Cyber security plays an important role in the development of information technology as well as Internet services. Our attention is usually drawn to "cyber security" when we hear about "cyber crimes". Our first thought on "national cyber security" therefore starts with how good our infrastructure is for handling "cyber crimes".
    In a cellular network, nearby devices communicate with each other, sharing information through the core and radio networks, which the network assigns for the communication. Several generations of cellular communication have been introduced: 1G, 2G, 3G, 4G, and now 5G. From the first to the third generation (1G-3G) the basic communication specifications remained the same while network and data speeds increased; in the fourth generation (4G) the specifications improved to a better level of communication, with data speeds rising to around 5 MB/s. More advanced features are expected in the fifth generation (5G), and we hope that the network issues observed in 4G will be resolved in 5G. As a result, billions of devices are expected to be connected in the future. We can therefore anticipate very large numbers of connections, mixed in nature, demanding higher data rates, lower delays, and enhanced system capacity. The available spectrum resources are limited, and mobile network operators must adapt them to different circumstances to meet these ever-rising demands.
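    The trade-off described above, between limited spectrum and rising data-rate demands, can be illustrated with the Shannon-Hartley channel capacity formula, which bounds the achievable data rate for a given bandwidth and signal-to-noise ratio. This is a general sketch, not a calculation from the paper; the bandwidth and SNR figures below are illustrative assumptions.

```python
import math

def shannon_capacity_bps(bandwidth_hz: float, snr_linear: float) -> float:
    """Shannon-Hartley bound: maximum error-free data rate (bits/s)
    for a channel of the given bandwidth and linear SNR."""
    return bandwidth_hz * math.log2(1.0 + snr_linear)

# Illustrative figures: a 20 MHz carrier at an SNR of 100 (i.e. 20 dB).
cap = shannon_capacity_bps(20e6, 100.0)
print(f"{cap / 1e6:.1f} Mbit/s")

# Doubling the bandwidth doubles the bound, which is why new
# generations seek wider (and higher-frequency) spectrum allocations.
cap_wide = shannon_capacity_bps(40e6, 100.0)
print(f"{cap_wide / 1e6:.1f} Mbit/s")
```

    The formula makes the spectrum constraint concrete: once the SNR is fixed, higher rates require more bandwidth, which is exactly the scarce resource operators must share across billions of connections.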
    Machine learning is an application of artificial intelligence in which machines learn by themselves and then act according to their instructions. Fundamentally, machine learning works on data sets; data consists of unprocessed raw facts and figures. The machine works on the data, tries to understand and correlate the different fields, and then produces an output. In this paper, we discuss the basic knowledge required to build machine learning models, the hype and reality surrounding machine learning, and, most importantly, how machine learning and related fields are used across various platforms. It is one of the fastest-growing fields in the present world, reducing the load of computation and helping companies to strategise accordingly. But as every coin has two sides, machine learning also has its positive and negative aspects, as day by day it reduces the need for human effort. Almost every multinational company uses this technology to solve the problems of society and people. Machine learning is also linked to other branches such as artificial intelligence, data science, computational statistics, and probability. These fields are all interrelated; machine learning is largely a matter of mathematics, mainly probability and statistics. Analysing the data in terms of various factors and then acting according to them is the essence of machine learning.
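    The idea that a machine "learns from data and then produces an output" can be sketched with one of the simplest learning methods, a nearest-neighbour classifier: it memorises labelled examples and labels a new point by the closest example it has seen. This is a generic illustration of the concept, not a model from the paper; the data points and labels are invented for the example.

```python
import math

def nearest_neighbour_predict(train, query):
    """train: list of (features, label) pairs; query: a feature tuple.
    Returns the label of the training point closest to the query
    (Euclidean distance) -- a minimal 1-nearest-neighbour classifier."""
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(train, key=lambda row: dist(row[0], query))[1]

# Hypothetical labelled data: two clusters of 2-D points.
data = [((1.0, 1.0), "low"), ((1.2, 0.9), "low"),
        ((8.0, 9.0), "high"), ((9.1, 8.5), "high")]

print(nearest_neighbour_predict(data, (1.1, 1.0)))  # a point near the "low" cluster
print(nearest_neighbour_predict(data, (8.5, 9.2)))  # a point near the "high" cluster
```

    Even this toy model shows the pattern common to all the methods the abstract alludes to: correlate a new input with previously seen data, then emit a prediction, with probability and statistics governing how well that generalises.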