Editor IJATCA
Fca, Publication, Department Member
- International Journal of Advanced Trends in Computer Applications (IJATCA) is a leading international e-journal for the publication of new ideas. IJATCA is a peer-reviewed, open-access scholarly journal that publishes original research work and review articles in all areas.
This paper aims to perform satellite image processing using Machine Learning models and evaluate their prediction scores. The research classifies satellite images into four distinct categories, namely "green area," "desert," "water," and "cloudy," and trains and evaluates the models against several criteria that highlight their performance, including precision, recall, F1-score, and total accuracy. The model exhibits exceptional accuracy in correctly predicting and identifying positive instances, as seen in the near-perfect precision and recall scores achieved across most classes. The F1-scores demonstrate a cohesive equilibrium across measures, indicating the approach's efficacy. Significantly, the model attains a remarkable overall accuracy of 99%, emphasizing its proficiency in precise image categorization. The use of macro and weighted averages highlights the resilience and uniformity of its performance, irrespective of variations in class distribution. The findings presented in this study provide evidence supporting the appropriateness of the model for a range of applications, with a particular emphasis on computer vision and machine learning. Evaluation measures like accuracy, recall, and F1-score provide a detailed analysis of the model's capabilities, rendering them essential for assessing classification models.
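As a rough illustration of how these per-class metrics relate to prediction counts, the following sketch computes precision, recall, F1-score, and overall accuracy from scratch over a hypothetical handful of labels for the four classes named above (the label lists are made up, not the paper's data):

```python
def per_class_metrics(y_true, y_pred, labels):
    """Precision, recall and F1 per class, plus overall accuracy."""
    metrics = {}
    for label in labels:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == label and p == label)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != label and p == label)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == label and p != label)
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
        metrics[label] = (precision, recall, f1)
    accuracy = sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)
    return metrics, accuracy

# Hypothetical predictions over the four classes named in the abstract.
labels = ["green area", "desert", "water", "cloudy"]
y_true = ["green area", "desert", "water", "cloudy", "water", "desert"]
y_pred = ["green area", "desert", "water", "cloudy", "water", "water"]
metrics, accuracy = per_class_metrics(y_true, y_pred, labels)
```

Macro averaging would then be a plain mean of the per-class values, while weighted averaging weights each class by its support.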
Smart hospitals, utilizing advanced technologies, seek to transform patient care, make processes more efficient, and better use resources. Artificial Intelligence (AI) is pivotal in transitioning traditional health centers into smart, adaptive environments. This article delves into how AI is used in such hospitals, emphasizing its role in elevating patient care, streamlining operations, and championing a patient-focused model. AI in these settings covers areas like medical imaging, diagnostics, predictive insights, patient interaction, and aiding clinical decisions. For instance, AI tools for diagnosis have shown impressive precision in pinpointing issues quickly through various imaging techniques. Predictive tools help track disease trends, streamline clinical tasks, and predict potential future hospital visits, leading to more tailored patient care. Additionally, AI promotes patient involvement via tools like virtual aides, chatbots, and remote health monitoring, enabling people to have more control over their health. Merging AI with clinical decision-making tools supports medical professionals in making informed decisions, leading to better patient results. However, using AI in this context also brings forth challenges related to data security, potential biases, regulatory adherence, and the necessity for cross-disciplinary teamwork. This article underscores the need to tackle these hurdles for an ethical and accountable application of AI in health environments. To conclude, infusing AI into smart hospitals can significantly reshape healthcare, leading to more personalized, data-informed, and efficient patient care. As AI progresses, its union with human expertise is set to usher in a new intelligent healthcare era, promising better patient experiences, improved results, and ultimately, a healthier global community.
The primary aim of this research was to determine optimal threshold values for vital parameters in different medicinal batches, guaranteeing high standards of quality and safety. To accomplish this objective, the study harnessed the combined potential of state-of-the-art machine learning algorithms, linear regression models, and data-engineering tools such as ETL pipelines and Airflow. One of the most common algorithms in both statistics and machine learning is linear regression. By leveraging these advanced data processing methodologies, the research aims to enhance the pharmaceutical industry's ability to assess and maintain the quality and safety of medicines effectively. The ETL process starts with extracting data from Hive, which offers efficient storage and processing capabilities, making it an ideal source for data extraction. The extracted data is then transformed using ML and data analysis techniques. The transformation logic is implemented in Jupyter Notebooks, which provide an interactive environment for developing and executing code, making it easy to apply ML algorithms and data manipulation techniques. After the data has been transformed, it is loaded into PostgreSQL, a powerful and scalable relational database management system with robust storage and querying capabilities, making it an ideal destination for the transformed data. The loaded data is organized within PostgreSQL tables. The transformed data stored in PostgreSQL can then be used by the final product, which could be a web application, a reporting dashboard, or any other system that requires access to the processed and enriched data. These tools enabled the formation of threshold values for parameters of different medicines with high accuracy through efficient data processing, analysis, and visualization, allowing users to make data-driven decisions and gain insights from the transformed data.
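The linear-regression step for deriving a threshold can be sketched as follows; the batch data, the three-sigma control band, and the closed-form fit are illustrative assumptions, not the paper's actual pipeline:

```python
import math

def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (closed form)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    a = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    b = my - a * mx
    return a, b

def threshold(xs, ys, k=3.0):
    """Upper limit: the regression line plus k standard deviations of residuals."""
    a, b = fit_linear(xs, ys)
    residuals = [y - (a * x + b) for x, y in zip(xs, ys)]
    sd = math.sqrt(sum(r * r for r in residuals) / len(residuals))
    return lambda x: a * x + b + k * sd

# Hypothetical batch data: x = batch index, y = measured parameter value.
xs = [1, 2, 3, 4, 5]
ys = [10.1, 10.3, 10.2, 10.5, 10.4]
limit = threshold(xs, ys)
```

In the described pipeline, a function like this would run inside the Jupyter transformation step and the resulting thresholds would be loaded into PostgreSQL.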
Today, the integration of FinTech with IoT and Artificial Intelligence is rapidly challenging banks. FinTech offers fast support and enhanced convenience, making it highly desirable for customers. This article explores the active and prominent areas of FinTech, including cryptocurrency and digital cash, smart contracts, open banking, blockchain technology, RegTech, InsurTech, unbanked services, robo-advisors, and crowdfunding. The paper presents a cohesive research analysis based on a critical evaluation of the literature. It also provides a comprehensive review of the history of FinTech and its various domains. Technologies such as Machine Learning, AI, and predictive analytics have a direct impact on overall business policies, revenue generation, and resource optimization. To summarize, FinTech is rapidly merging with IoT and AI, posing a significant challenge to traditional banks. The key features of FinTech, including fast support and improved convenience, are highly attractive to customers. The areas classified under FinTech, such as cryptocurrency, smart contracts, open banking, and more, are explored in this article. Additionally, the paper offers a critical assessment of the literature, presenting research themes and a historical overview of FinTech. The integration of technologies like Machine Learning and AI in financial services has profound implications for business strategies, revenue, and resource management.
Sentiment analysis, a subfield of natural language processing (NLP), involves the automated identification and classification of sentiment or opinion expressed in text. Traditionally, sentiment analysis has focused on English language texts, but with the increasing availability of multilingual data on social media, online reviews, and news articles, there is a growing demand for sentiment analysis in multiple languages. Analyzing sentiment in multiple languages presents unique challenges due to linguistic differences, cultural nuances, and the availability of labeled data. This paper provides an analysis of feature-based machine learning approaches used for sentiment analysis in multiple languages. It discusses the challenges and considerations specific to multilingual sentiment analysis and provides insights into the performance and effectiveness of different machine learning models. The goal is to explore the performance, effectiveness, and generalization capabilities of different machine learning models across diverse linguistic contexts.
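A minimal sketch of the feature-based family of approaches discussed here, using bag-of-words features and multinomial Naive Bayes; the tiny bilingual training set is entirely hypothetical, and real multilingual work would use labeled corpora per language:

```python
import math
from collections import Counter

class NaiveBayes:
    """Multinomial Naive Bayes over bag-of-words features, add-one smoothing."""

    def fit(self, docs, labels):
        self.classes = set(labels)
        self.word_counts = {c: Counter() for c in self.classes}
        self.class_counts = Counter(labels)
        self.vocab = set()
        for doc, label in zip(docs, labels):
            tokens = doc.lower().split()
            self.word_counts[label].update(tokens)
            self.vocab.update(tokens)
        return self

    def predict(self, doc):
        best, best_lp = None, float("-inf")
        total_docs = sum(self.class_counts.values())
        for c in self.classes:
            lp = math.log(self.class_counts[c] / total_docs)  # class prior
            denom = sum(self.word_counts[c].values()) + len(self.vocab)
            for tok in doc.lower().split():
                lp += math.log((self.word_counts[c][tok] + 1) / denom)
            if lp > best_lp:
                best, best_lp = c, lp
        return best

# Toy bilingual training data (hypothetical English + Spanish reviews).
docs = ["great movie", "terrible film", "pelicula excelente", "pelicula horrible"]
labels = ["pos", "neg", "pos", "neg"]
model = NaiveBayes().fit(docs, labels)
```

Language-specific tokenization, stemming, and lexicons are exactly the points where the multilingual challenges described above enter.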
Heart failure is a serious cardiovascular condition that affects millions of people worldwide and poses a significant burden on healthcare systems. Early detection and prediction of heart failure can significantly improve patient outcomes by enabling timely intervention and management. In recent years, machine learning techniques have emerged as powerful tools for developing predictive models in healthcare. This abstract presents a heart failure prediction system that utilizes machine learning algorithms to identify individuals at risk of developing heart failure. The system incorporates various features such as demographic information, medical history, vital signs, and laboratory test results to build a predictive model. Data preprocessing techniques are applied to handle missing values, normalize the data, and address data imbalances. The selected machine learning algorithm undergoes training and validation using a large dataset of heart failure cases. The model's performance is evaluated based on accuracy, sensitivity, specificity, and area under the ROC curve. The system's user-friendly interface allows healthcare professionals to input patient data, view the prediction results, and make informed decisions regarding patient care. The implementation of the heart failure prediction system involves the use of modern tools and technologies such as Scikit-Learn, TensorFlow, and Keras for algorithm selection and model development. Data storage and retrieval are handled using a relational database management system such as MySQL. Privacy and ethical considerations are addressed through robust data protection measures and compliance with relevant regulations. The evaluation and results analysis demonstrate the system's effectiveness in predicting heart failure cases with high accuracy and sensitivity. A comparison with existing prediction systems highlights the system's competitive performance and its potential to enhance early detection and intervention. 
In conclusion, the heart failure prediction system presented in this abstract offers a valuable tool for healthcare professionals in identifying individuals at risk of heart failure. The system's implementation, evaluation, and comparison with existing approaches contribute to the growing body of knowledge in the field. Future work could focus on enhancing the system's interpretability, generalizability, and integration with real-time monitoring devices for continuous heart failure risk assessment.
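Two of the preprocessing steps mentioned in the abstract, imputing missing values and normalizing the data, can be sketched as follows; the vital-sign column is hypothetical, and a real system would use library implementations such as those in Scikit-Learn:

```python
def impute_mean(column):
    """Replace missing values (None) with the mean of the observed values."""
    observed = [v for v in column if v is not None]
    mean = sum(observed) / len(observed)
    return [mean if v is None else v for v in column]

def min_max(column):
    """Scale values linearly to the range [0, 1]."""
    lo, hi = min(column), max(column)
    return [(v - lo) / (hi - lo) for v in column]

# Hypothetical vital-sign column (e.g. resting heart rate) with a missing entry.
hr = [72, None, 88, 64, 96]
hr_clean = min_max(impute_mean(hr))
```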
Blood vessels are important biomarkers in skin lesions, both diagnostically and clinically. Detection and quantification of cutaneous blood vessels provide critical information towards lesion diagnosis and assessment. In this paper, a novel framework for detection and segmentation of cutaneous vasculature from dermoscopy images is presented, and the extracted vascular features are further explored for skin cancer classification. Given a dermoscopy image, we segment vascular structures of the lesion by first decomposing the image into melanin and hemoglobin components using independent component analysis. This eliminates the effect of pigmentation on the visibility of blood vessels. Using k-means clustering, the hemoglobin component is then clustered into normal, pigmented, and erythema regions. Shape filters are then applied to the erythema cluster at different scales. A vessel mask is generated as a result of global thresholding. The segmentation sensitivity and specificity of 90% and...
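The k-means step described above can be illustrated on scalar intensities; this is a generic Lloyd's-algorithm sketch on made-up values, not the paper's implementation, with three clusters standing in for the normal, pigmented, and erythema regions:

```python
def kmeans_1d(values, k, iters=50):
    """Lloyd's algorithm on scalar values (k >= 2).
    Returns centroids (ascending, for this initialization) and labels."""
    s = sorted(values)
    # Spread the initial centroids evenly across the sorted values.
    centroids = [s[i * (len(s) - 1) // (k - 1)] for i in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            idx = min(range(k), key=lambda i: abs(v - centroids[i]))
            clusters[idx].append(v)
        new = [sum(c) / len(c) if c else centroids[i] for i, c in enumerate(clusters)]
        if new == centroids:
            break
        centroids = new
    labels = [min(range(k), key=lambda i: abs(v - centroids[i])) for v in values]
    return centroids, labels

# Hypothetical hemoglobin-component intensities forming three groups.
intensities = [0.05, 0.1, 0.12, 0.45, 0.5, 0.55, 0.9, 0.95]
centroids, labels = kmeans_1d(intensities, 3)
```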
International Journal of Advanced Trends in Computer Applications (IJATCA) extensively covers research work on cutting-edge innovations and adequate promotion methods. The group of researchers and academicians who are part of International Journal of Advanced Trends in Computer Applications contribute by assisting in reviewing manuscripts and enhancing them, offering guidance for composing high-quality research papers through proper evaluation.
International Journal of Advanced Trends in Computer Applications is a half-yearly, peer-reviewed international research e-journal that addresses both applied and theoretical issues. The scope of the journal encompasses research articles, original research reports, reviews, short communications, and scientific commentaries in the fields of computer science and engineering and other related areas. The journal addresses issues of both vertical and horizontal applications in their respective areas.
The aim of IJATCA is to provide an international forum for the publication and dissemination of original work, either empirical or theoretical, that contributes to the understanding of the main and related disciplines of engineering. It also publishes peer-reviewed research and review articles in the rapidly developing field of computer science engineering and technology and provides a venue for high-caliber researchers, PhD students, and professionals to submit ongoing research and developments. It is an international scientific journal that aims to contribute to constant scientific research and training, so as to promote research in the field of computer science. This journal is an e-journal offering full access to its research and review papers.
Research Interests: Computer Science, Artificial Intelligence, Remote Sensing, Machine Learning, Applications of Machine Learning, Statistical Machine Learning, Biomedical Informatics, Machine Intelligence, Machine Learning and Pattern Recognition, Satellite Imagery, Computer Science and Information Technology, Remote Sensing and GIS Applications in Forestry, Computer Science and Engineering, and Electrical Engineering and Computer Science
A MANET can be defined as an autonomous system of nodes or MSs (also serving as routers) connected by wireless links, the union of which forms a communication network modeled as an arbitrary communication graph. Self-configuring means that any mobile node can join or leave the network at will. It is a decentralized type of network in which mobile nodes can move from one location to another. Due to the random movability of the mobile nodes, two factors, route establishment and route maintenance, become the major problems of MANET networks. The main spotlight of this research paper is route establishment and route maintenance, which are core properties of a MANET. The EETC protocol is a route establishment and route maintenance protocol in which a broken route is recovered on the basis of node connectivity. The node that has maximum connectivity is selected as the best node for route recovery in the EETC protocol. In this research work, the EETC protocol is further improved by adding a buffer-size parameter for route recovery, which also maintains and improves quality of service: better throughput, less energy consumption, higher packet delivery ratio, lower end-to-end delay, and less packet loss and overhead in the network. Simulation results show that the proposed IEETC protocol performs well compared to the existing EETC protocol in terms of these parameters.
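The abstract gives no pseudocode, but the connectivity-based selection it describes, and the buffer-size refinement added in IEETC, might be sketched like this (the topology and buffer values are hypothetical):

```python
def recovery_node(adjacency, candidates):
    """EETC-style choice: among candidate nodes, pick the one with the
    highest connectivity (degree); ties go to the lowest node id."""
    return max(sorted(candidates), key=lambda n: len(adjacency[n]))

def recovery_node_with_buffer(adjacency, buffer_free, candidates):
    """IEETC-style refinement sketched in the abstract: prefer high degree,
    then larger free buffer, so the repaired route drops fewer packets."""
    return max(sorted(candidates),
               key=lambda n: (len(adjacency[n]), buffer_free[n]))

# Hypothetical topology: node -> set of neighbors, plus free buffer slots.
adjacency = {1: {2, 3}, 2: {1, 3, 4, 5}, 3: {1, 2}, 4: {2, 5}, 5: {2, 4}}
buffer_free = {2: 10, 4: 40, 5: 60}
```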
Mobile ad-hoc networks are flexible and mobile; they use wireless connections to connect to various networks. An ad-hoc network is a collection of wireless mobile hosts forming a network without a central coordinating point. This autonomy means that any mobile node can join or leave the network at any point in time, which causes many problems, such as degraded QoS parameters. EETS is an improved version of the AODV protocol for path recovery in mobile ad-hoc networks. In the EETS protocol, when a mobile node changes its location, a link failure occurs in the network. EETS works on the node connectivity factor for link recovery: when any node detects a link failure, the node to which the maximum number of nodes is connected is selected as the best node for link recovery. The EETS protocol performs well in terms of certain parameters, but for link recovery it does not include quality of service parameters. In this research work, an improvement to the EETS protocol is proposed by applying quality of service parameters, such as overhead, energy consumption, and delay, for path recovery and better link stability in mobile ad-hoc networks.
Skin cancer is among the most prevalent types of carcinoma, particularly among Caucasian and pale-skinned populations. Specifically, melanocytic skin lesions are considered the most lethal among the three pervasive skin carcinomas and the second most common type amongst young adults who are 15-29 years old. These concerns have impelled the requirement for automated systems for the diagnosis of skin carcinomas within a limited time frame, reducing unnecessary biopsy, accelerating diagnosis, and giving reproducibility of diagnostic outcomes. In this survey paper, a brief overview of automated detection and segmentation of vascular structures of skin lesions is presented.
Service-Oriented Architecture (SOA) is a technique that can be employed to unite various services across operating systems, platforms, and networks. Several organizations fail to use SOA completely, and the reason behind this is an underdeveloped adoption process. The author has conducted an exploratory study of the recent concerns and numerous practices related to SOA adoption, along with assessing the various maturity levels used and the role of information technology in SOA adoption. The required information was gathered through a literature survey that explored previous work on SOA adoption via web searches and reading of journals and papers. The paper focuses on the significant issues related to the adoption of SOA in organizations.
A distributed denial-of-service (DDoS) attack is one of the most powerful weapons on the internet. Research indicates that several works have been done to mitigate DDoS attacks on Linux-based servers. However, the types of DDoS attacks covered were mostly HTTP GET flood attacks on ports 80 and 443. Moreover, the IPTables firewall rules used were not automated using Bash scripts to make them portable, and the firewall rules in most cases were written to mitigate attacks coming from a single IP address. This study therefore expands the scope of mitigating DDoS attacks using IPTables to include TCP SYN flood attacks, UDP flood attacks, and PING (ICMP) flood attacks. After carrying out the test once the Bash scripts had been executed, DDoS attacks in the form of TCP SYN flood, UDP flood, and ICMP (ping) flood were generated using hping3 and were successfully mitigated: the Linux server dropped the packets that make up these attacks while allowing legitimate traffic and users to access resources on the server.
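A sketch of the kind of rate-limiting IPTables rules such scripts automate; the rule set and thresholds below are illustrative assumptions, not the study's actual Bash scripts, and a real deployment would tune the limits to its traffic:

```python
def iptables_rules(syn_limit="10/s", icmp_limit="5/s", udp_limit="20/s"):
    """Generate IPTables commands (as strings) that rate-limit SYN, ICMP and
    UDP floods and drop the excess. Thresholds are illustrative, not tuned."""
    return [
        # TCP SYN flood: accept a bounded rate of new connections, drop the rest.
        f"iptables -A INPUT -p tcp --syn -m limit --limit {syn_limit} -j ACCEPT",
        "iptables -A INPUT -p tcp --syn -j DROP",
        # ICMP (ping) flood.
        f"iptables -A INPUT -p icmp --icmp-type echo-request -m limit --limit {icmp_limit} -j ACCEPT",
        "iptables -A INPUT -p icmp --icmp-type echo-request -j DROP",
        # UDP flood.
        f"iptables -A INPUT -p udp -m limit --limit {udp_limit} -j ACCEPT",
        "iptables -A INPUT -p udp -j DROP",
    ]

rules = iptables_rules()
```

Writing the rules into a Bash script (one command per line, run as root) is what makes the mitigation portable across servers, as the study emphasizes.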
The main aim of image enhancement is to apply some operation to the image in the spatial or frequency domain so that the resultant image is more appropriate than the original for a particular application. In this paper, a method to enhance low-contrast images in the spatial domain is proposed. The method first applies a power law to correct the image tone and then applies improved histogram equalization to reduce the noise amplification found in ordinary histogram equalization. The performance is measured both quantitatively and qualitatively, and the results reveal that the method is better than the state of the art and can be applied to many image processing and computer vision applications.
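The two stages, power-law tone correction followed by histogram equalization, can be sketched on 8-bit intensities; note this shows the classic equalization, not the paper's improved variant, and the pixel strip is hypothetical:

```python
def gamma_correct(pixels, gamma):
    """Power-law tone correction on 8-bit values: out = 255 * (in/255)^gamma."""
    return [round(255 * (p / 255) ** gamma) for p in pixels]

def equalize(pixels):
    """Classic histogram equalization via the cumulative distribution function."""
    n = len(pixels)
    hist = [0] * 256
    for p in pixels:
        hist[p] += 1
    cdf, total = [0] * 256, 0
    for i in range(256):
        total += hist[i]
        cdf[i] = total
    cdf_min = min(c for c in cdf if c > 0)
    return [round((cdf[p] - cdf_min) / (n - cdf_min) * 255) for p in pixels]

# Hypothetical low-contrast strip of pixels; gamma < 1 brightens dark regions.
dark = [10, 20, 20, 30, 40, 40, 50, 60]
enhanced = equalize(gamma_correct(dark, 0.5))
```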
Path planning has been one of the major research challenges in mobile robot navigation systems. Researchers in this area have recorded significant success; however, open research issues remain. Ant Colony Optimization (ACO), an intelligent optimization algorithm, has recorded successes in many domains, including robot control. In this work, ACO was used to find solutions to the robot routing problem. Simulation results established that the proposed algorithm outperformed the min-max ant system technique.
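A generic ACO sketch for routing on a small weighted graph; the parameters, the toy map, and the tie-breaking are illustrative and not taken from the paper:

```python
import random

def aco_shortest_path(graph, start, goal, n_ants=20, n_iters=30,
                      alpha=1.0, beta=2.0, evaporation=0.5, q=1.0, seed=0):
    """Basic Ant Colony Optimization for a start-goal path.
    graph: {node: {neighbor: edge_cost}}."""
    rng = random.Random(seed)
    pheromone = {u: {v: 1.0 for v in nbrs} for u, nbrs in graph.items()}
    best_path, best_cost = None, float("inf")

    def walk():
        path, visited = [start], {start}
        while path[-1] != goal:
            u = path[-1]
            options = [v for v in graph[u] if v not in visited]
            if not options:
                return None  # ant is stuck in a dead end
            weights = [pheromone[u][v] ** alpha * (1.0 / graph[u][v]) ** beta
                       for v in options]
            v = rng.choices(options, weights=weights)[0]
            path.append(v)
            visited.add(v)
        return path

    for _ in range(n_iters):
        paths = [p for p in (walk() for _ in range(n_ants)) if p]
        # Evaporate, then deposit pheromone inversely proportional to path cost.
        for u in pheromone:
            for v in pheromone[u]:
                pheromone[u][v] *= (1 - evaporation)
        for p in paths:
            cost = sum(graph[a][b] for a, b in zip(p, p[1:]))
            if cost < best_cost:
                best_path, best_cost = p, cost
            for a, b in zip(p, p[1:]):
                pheromone[a][b] += q / cost
    return best_path, best_cost

# Hypothetical grid-like map for a mobile robot; the shortest A->D route costs 3.
graph = {"A": {"B": 1, "C": 4}, "B": {"C": 1, "D": 5}, "C": {"D": 1}, "D": {}}
path, cost = aco_shortest_path(graph, "A", "D")
```

The min-max ant system the paper compares against differs mainly in clamping pheromone values between bounds and letting only the best ant deposit.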
With advancements in the medical field, technology is being applied to ingenious applications that let doctors use them as beneficial tools. Examining dental radiographs manually usually fritters away dentists' time and is also error-prone due to their complex structure. The idea is to analyze dental radiographs in an easier way by applying neural networks and transfer learning techniques. These intelligent techniques assist in producing precise results. The novelty is to apply neural networks to those x-rays and analyze them with the aid of transfer learning models. For this, a radiograph is taken as input for building a model using the weights and models of transfer learning. Various architectural models from transfer learning are applied to the training x-ray data, yielding accurate results. Among the applied models, the MobileNet architecture with some additional neural network layers gave error-free results. This x-ray analysis segregates radiographs having caries and gives the output as a probability value. This application allows dentists a quicker and easier reading of dental x-rays, saving time.
Energy prediction of appliances requires identifying and predicting individual appliance energy consumption when combined in a closed-chain environment. This experiment aims to provide insight into reducing energy consumption by identifying trends and the appliances involved. The proposed model formalizes such an approach using a time series forecasting-based process that considers the correlation between different appliances. The work has been conducted in two parts. The first part highlights and identifies the energy consumption trends. The second part focuses on the comparison and analysis of different algorithms, with the main objective of understanding which algorithm provides a better result in predicting energy consumption. A comparison of algorithms for appliance usage prediction using identification and direct consumption readings is presented in this paper. The work is based on real data taken from the REMODECE database, comprising 19,735 instances with 29 attributes; the data records energy use at 10-minute intervals over about 4.5 months.
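Two simple one-step-ahead baselines, persistence and a moving average, compared by mean absolute error give a feel for the kind of algorithm comparison described; the readings below are hypothetical, not REMODECE data:

```python
def mae(actual, predicted):
    """Mean absolute error between two equal-length series."""
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

def persistence_forecast(series):
    """Predict each step as the previous observation."""
    return series[:-1]

def moving_average_forecast(series, window=3):
    """Predict each step as the mean of up to `window` preceding observations."""
    return [sum(series[max(0, i - window):i]) / len(series[max(0, i - window):i])
            for i in range(1, len(series))]

# Hypothetical 10-minute appliance readings (Wh); compare the two baselines
# on one-step-ahead prediction of series[1:].
series = [50, 52, 51, 60, 58, 57, 55, 70, 68, 66]
err_persist = mae(series[1:], persistence_forecast(series))
err_ma = mae(series[1:], moving_average_forecast(series))
```

The same MAE harness extends directly to the learned models whose comparison the paper reports.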
Games and games technology have been used in recent times to investigate their possible impact in teaching abstract concepts. The underlying hypothesis is that the motivating qualities of games may be harnessed and embedded in a game-based learning system (GLS) to accelerate the comprehension of abstract concepts. In this paper, an AntHill Invader Game (AIG) framework is presented to guide the design of gamified ant social collaborative behaviour in an ant hill colony. A detailed workflow illustration and mathematical model for the use of AIG in the development of a GLS is presented.
In feature extraction methods, the extracted features contain noise such as spikes or ridges, and gaps, which affect accuracy. This paper aims to remove the connected noise (ridges) and to fill the gaps between two points of roads extracted by the Normalized Difference Asphalt Road Index (NDARI). The roads extracted by this methodology have connected noise such as spikes. This connected noise can be removed or smoothed by the Subspace Constrained Mean Shift (SCMS) algorithm, which is non-parametric, and by the Circle Window method. Using these methods, the road (line) features are smoothed, as shown in the experimental results.
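NDARI's exact band combination is not given here, but like other normalized-difference indices it follows the (a - b) / (a + b) form; the sketch below uses hypothetical reflectances and an illustrative band pair, not the index's actual bands:

```python
def normalized_difference(band_a, band_b, eps=1e-9):
    """Per-pixel normalized difference: (a - b) / (a + b).
    The same form underlies indices such as NDVI; NDARI plugs in the band
    pair that separates asphalt (the band choice here is illustrative)."""
    return [(a - b) / (a + b + eps) for a, b in zip(band_a, band_b)]

# Hypothetical reflectance values for two bands along a scan line.
band_a = [0.2, 0.5, 0.8]
band_b = [0.2, 0.1, 0.4]
index = normalized_difference(band_a, band_b)
```

Thresholding such an index yields the raw road mask to which the SCMS and Circle Window smoothing steps would then be applied.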
This paper explores compression of Aadhaar number storage using the concept of reduced bit-level ordering. The Aadhaar number is a unique twelve-digit number used in various government schemes. The population of our country exceeds 135 crore, which means more than 135 crore Aadhaar numbers will exist. Uniqueness of a beneficiary is ensured by detecting and avoiding duplicates, which requires comparing Aadhaar numbers in minimal time; hence there is a need to reduce Aadhaar storage in order to minimize search time. The numbers are represented using various number representations along with the storage each requires. Compression of Aadhaar storage is implemented using bit-level ordering, which takes sorted integers as input and produces index bits and content bits as output. The expected and obtained space savings are shown in tables and graphs.
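The index-bit/content-bit split on sorted integers can be sketched as follows. This is a hedged illustration of the general idea, not the paper's exact scheme: the bit widths and sample values are hypothetical, and real Aadhaar numbers would need about 40 bits rather than the 8 used here.

```python
# Illustrative sketch of bit-level ordering: the high-order "index bits" of
# each sorted number select a bucket, and only the low-order "content bits"
# are stored per entry, so the shared prefix is stored once per bucket.
# Bit widths here are toy values for readability.

def bit_level_order(sorted_nums, total_bits, index_bits):
    content_bits = total_bits - index_bits
    mask = (1 << content_bits) - 1
    buckets = {}
    for n in sorted_nums:
        idx = n >> content_bits            # shared high-order index bits
        buckets.setdefault(idx, []).append(n & mask)  # low-order content bits
    return buckets

nums = sorted([0b10100001, 0b10100111, 0b10110010])
buckets = bit_level_order(nums, total_bits=8, index_bits=4)
# Two buckets (prefixes 0b1010 and 0b1011); each entry now needs only
# 4 content bits instead of the full 8.
```

The space saving grows with the number of entries sharing each index prefix, which is why the method assumes sorted input.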
Research Interests: Big Data and Compression
Obesity in teenagers and adults has increased worldwide, with serious short- and long-term health consequences. Technology has made it possible to discover new ways of treating diseases and health problems, and data mining has become a relevant area of research, especially in recent years, owing to its precision and reliability in analyzing patient datasets to detect diseases and facilitate their prevention. The goal of this study was to identify the data mining techniques and algorithms most commonly used to detect factors that favor the onset of obesity, and to determine the reliability of those methods based on the results obtained from a data mining model. Data mining methods such as simple regression and decision trees are the most commonly used to detect obesity levels: simple regression appeared in 19% of the articles reviewed and decision trees in 11% of them.
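Simple (one-variable) regression, the most frequent method in the reviewed articles, can be shown in closed form. The data below are invented for illustration only, not a real obesity dataset, and the variable names are hypothetical.

```python
# Minimal simple linear regression by least squares:
# slope b = cov(x, y) / var(x), intercept a = mean(y) - b * mean(x).

def simple_regression(xs, ys):
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    b = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
         / sum((x - mean_x) ** 2 for x in xs))
    a = mean_y - b * mean_x
    return a, b

# Hypothetical example: a predictor variable vs. an obesity-related outcome.
xs = [0, 1, 2, 3, 4]
ys = [1, 3, 5, 7, 9]                       # perfectly linear: y = 1 + 2x
a, b = simple_regression(xs, ys)           # a == 1.0, b == 2.0
```

Decision trees, the other common method noted above, would instead partition the predictor space by threshold tests rather than fit a single line.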
About half a decade ago, when smart glass technology still looked promising, popular opinion held that by 2020 this interactive technology would have advanced into a mainstream consumer product, becoming more fashionable, socially acceptable, and functional. However, the results of this study show that progress on the smart glasses concept has been rather slow. Smart glasses are still not mainstream consumer products; rather, they are specialized tools adopted mainly by industry for various tasks despite their shortcomings. This paper investigates the field of smart glass technology, providing an overview of existing products and their applications, identifying five major barriers to consumer acceptance of smart glasses, and offering a research road map for future work on the usage and acceptance of smart glass technologies as consumer products.
In today’s world, which is becoming more digitalized by the day while data grows exponentially, it is very important to attend to data security and privacy: as data increases, the threats to that data increase at the same rate. Data uploaded to the internet or any other network needs to be kept safe, both for the sake of a company’s private data and for the people using that network. In this paper the authors discuss the various types of attacks and threats on the network that a company or an individual must guard against to protect against malicious activity. The paper also explores the various security measures available to protect data and to provide proper security and privacy.
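One concrete example of the kind of protective measure surveyed here is message authentication: attaching an HMAC to data sent over a network so that tampering is detectable. The key and message below are illustrative placeholders, not part of the paper.

```python
# Sketch of detecting network tampering with an HMAC (hash-based message
# authentication code). Key and message are placeholder values.

import hashlib
import hmac

key = b"shared-secret-key"                 # placeholder; a real deployment
                                           # would use a securely managed secret
message = b"quarterly-report.csv contents"

tag = hmac.new(key, message, hashlib.sha256).hexdigest()

def verify(key, message, tag):
    """Recompute the HMAC and compare in constant time."""
    expected = hmac.new(key, message, hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

ok = verify(key, message, tag)                 # True: message unmodified
bad = verify(key, b"tampered contents", tag)   # False: tampering detected
```

`compare_digest` is used instead of `==` to avoid timing side channels when comparing the tags.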
Among platforms for socialising and sharing, the social media platform that has most decisively stepped into the era of digital marketing is Instagram. Using a general social networking application for marketing is quite different from using an application designed specifically for marketing. When Instagram was launched on 6 October 2010, it was used for sharing photos and videos; in 2016, Instagram officially introduced professional accounts, i.e. business profiles. Today Instagram is one of the most popular social networking platforms officially supporting business use as well as social networking. With over five billion users, Instagram's popularity cannot be neglected when it comes to marketing: the platform is a pool of customers, audiences, communities, talent, art and, most importantly, memories. In simple words, Instagram is an innovation in digital marketing. This paper is a brief overview of Instagram's most interesting features for digital marketing, covering not only products but also talent. It includes the steps to switch from a personal account to a business account and how to use its features most effectively to step up in the business world. It also covers the different ways of promotion, marketing techniques, and effective terminology for using the application productively.
Reducing excessive energy consumption is becoming a key concern in networking, on account of the potential economic and environmental benefits. These concerns, generally referred to as "green networking", relate to embedding energy awareness in the design, the devices and the protocols of networks. In this work, the author first formulates a more precise definition of the "green" property, then identifies a few principles that are key enablers of energy-aware networking research, surveys the up-to-date state of the art, and offers a list of the relevant work, with a particular focus on green networking.
Cloud computing provides on-demand services to its clients, and data storage is among the primary services it offers. A cloud service hosts a data owner's data on its servers, and users access their data through these services. Because data owners and servers are different entities, care must be taken that the data is correctly hosted on the cloud storage server. This paper discusses the various techniques used for secure data storage on the cloud. One vision of cloud computing is access to traditional supercomputing and high-performance computing power; such facilities, typically operated by military and research organizations, perform tens of trillions of computations per second in applications such as financial portfolio analysis, delivery of personalized information, user data storage, and powering large immersive computer games.
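Because the owner and the storage server are different entities, one simple technique in this space is a client-side integrity check: hash the data before upload and re-verify the hash after download. The sketch below is illustrative; the upload/download steps are stand-ins, not a real cloud API.

```python
# Sketch of a client-side integrity check for cloud storage: the data owner
# keeps a SHA-256 digest of the file and compares it against the digest of
# whatever the cloud later returns. No real cloud API is used here.

import hashlib

def digest(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

original = b"data owner's file contents"
stored_digest = digest(original)           # kept locally by the data owner

# ... upload to the cloud, then later download the object back ...
downloaded = b"data owner's file contents"

intact = digest(downloaded) == stored_digest   # True only if unmodified
```

This detects corruption or modification at the provider but does not hide the data; confidentiality would additionally require client-side encryption.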
India is now home to the largest number of blind people in the world: of the 37 million blind people worldwide, 15 million live in India, and 75% of these are cases of avoidable blindness. Although the nation needs about 2.5 million donated eyes every year, its 109 eye banks (five of which are located in Delhi) manage to collect a maximum of only 25,000 eyes, of which 30% cannot be used. The shortage of donated eyes is thus becoming a serious problem: of the 15 million blind people in India, three million (26% of whom are children) have corneal disorders, yet only 10,000 corneal transplants are performed every year owing to the shortage of donated eyes. The goal of the bionic eye is to restore fundamental visual cues for people with ocular conditions such as retinitis pigmentosa, an inherited disease of the eye. A video camera mounted on a pair of glasses acquires and processes images; these images are sent wirelessly to a bionic implant at the back of the eye, which stimulates the remaining optic nerve fibres to generate points of light (phosphenes) that form the basis of images in the brain. In this way, even blind people can regain perception. The eye is a complex optical system that, like other organs, can be damaged by infection or injury, so several prostheses have been proposed to allow such people to reclaim their vision and enjoy a full life. This document refers chiefly to the cornea, but also covers the iris. Artificial retinas, which are being researched by others and where the field has made considerable progress, are not covered here.
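The camera-to-implant step described above can be made concrete with a toy sketch: a camera frame is downsampled to a small grid of phosphene intensities, one value per stimulation electrode. The frame values, grid size, and 2x2 pooling below are illustrative assumptions, not a real implant's parameters.

```python
# Rough sketch of mapping a camera frame to phosphene intensities: average
# non-overlapping block x block pixel tiles into one brightness value each,
# one value per electrode. Frame data and block size are illustrative.

def to_phosphenes(frame, block=2):
    """Downsample a 2D brightness frame into a coarse phosphene grid."""
    rows, cols = len(frame), len(frame[0])
    grid = []
    for r in range(0, rows, block):
        row = []
        for c in range(0, cols, block):
            tile = [frame[r + i][c + j]
                    for i in range(block) for j in range(block)]
            row.append(sum(tile) / len(tile))
        grid.append(row)
    return grid

frame = [[0, 0, 255, 255],
         [0, 0, 255, 255],
         [255, 255, 0, 0],
         [255, 255, 0, 0]]
grid = to_phosphenes(frame)                # [[0.0, 255.0], [255.0, 0.0]]
```

Real retinal prostheses stimulate far fewer electrodes than a camera has pixels, which is why this kind of aggressive downsampling is the essential first step.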
The deep web, also called the invisible or hidden web, comprises portions of the World Wide Web whose content is not indexed by standard search engines for one reason or another. Common examples of deep web content are webmail and internet banking, as well as paid services behind a paywall such as on-demand video and many others; practically everyone who uses the web visits what could be described as deep web sites daily without knowing it, since this content is hidden behind HTML forms. The surface web is the opposite term. The dark web proper refers to whole sections of the web in which all of the sites are hidden from the view of ordinary web surfers, and in which the people using them are likewise hidden from view. The dark web is the anonymous web, where it is very hard for hackers, spies or government agencies to track web users and examine which websites they are using and what they are doing there.
This paper is aimed particularly at readers concerned with major systems employed in medium to large commercial or industrial enterprises. It examines the nature and significance of the various potential attacks and surveys the defence options available. It concludes that IT owners need to think of the threat in more global terms and to give a new focus and priority to their defences. Prompt action can secure a major improvement in IT resilience at a modest marginal cost, both financially and in terms of normal IT operation. Cyber security plays an important role in the development of information technology as well as internet services. Our attention is usually drawn to "cyber security" when we hear about "cyber crimes"; our first thought on "national cyber security" therefore concerns how good our infrastructure is for handling cyber crimes.
In a cellular network, nearby devices can communicate with each other, sharing information via the core and radio network, with the network assigning the communication mode. Several generations of cellular communication have been introduced: 1G, 2G, 3G, 4G, and now 5G. From the 1st to the 3rd generation (1G-3G), the basic specifications of communication remained similar while network and data speeds increased; with the 4th generation (4G), the specifications moved to a better level of communication and data speeds rose to around 5 MB/s. We now expect more advanced features in the 5th generation (5G), including resolution of network issues seen in 4G. Billions of devices will consequently be connected in the future, so we can expect a very large number of connections, mixed in nature, demanding higher data rates, lower delays, and enhanced system capacity. The available spectrum resources are limited and must be adapted to the different circumstances faced by mobile network operators in meeting these ever-rising demands.
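The tension between limited spectrum and rising rate demands can be made concrete with the Shannon capacity bound C = B log2(1 + SNR): with a fixed bandwidth, higher rates require better signal quality or denser spectrum reuse. The channel parameters below are illustrative examples, not figures from any specific 4G or 5G deployment.

```python
# Worked example of the Shannon capacity bound for an AWGN channel:
# C = B * log2(1 + SNR), the ceiling on error-free data rate in bit/s.
# Bandwidth and SNR values below are illustrative.

import math

def shannon_capacity(bandwidth_hz, snr_linear):
    """Upper bound on error-free data rate, in bit/s."""
    return bandwidth_hz * math.log2(1 + snr_linear)

# A 20 MHz channel at 20 dB SNR (linear SNR = 100):
c = shannon_capacity(20e6, 100)            # roughly 133 Mbit/s ceiling
```

Doubling the rate at the same bandwidth would require roughly squaring the linear SNR, which is why denser cells and wider (e.g. millimetre-wave) spectrum are pursued rather than SNR improvements alone.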