Dr. Ramesh Dharavath H N

SPECIALIZATIONS

Developing Computational Paradigms for BCI

A brain–computer interface (BCI) is a combination of hardware and software that provides a non-muscular channel for sending messages and commands to the outside world and for controlling external devices such as computers. BCIs help severely disabled patients, such as those with neuromuscular injuries or locked-in syndrome (LIS), to lead as normal a life as possible. BCI has applications not only in medicine but also in entertainment, lie detection, gaming, and other fields. In this work, a Deceit Identification Test (DIT) is performed using BCI based on the P300 component, a positive peak occurring between 300 ms and 1000 ms after stimulus onset. The goal is to recognize and classify P300 signals with high accuracy. Pre-processing is performed with a band-pass filter to eliminate artifacts.
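The band-pass pre-processing step can be sketched as follows. This is a minimal illustration, not the actual pipeline used in the work: it applies an FFT-based band-pass mask to a synthetic EEG-like trace, and the sampling rate, pass band, and signal components are assumed values chosen for demonstration.

```python
import numpy as np

def bandpass_fft(signal, fs, low=0.5, high=30.0):
    """Band-pass filter via FFT bin masking.

    Keeps only frequency components between `low` and `high` Hz, a
    typical pre-processing step to remove slow drift and line-noise
    artifacts before P300 detection. Band edges here are assumptions
    for illustration, not the parameters used in the original work.
    """
    spectrum = np.fft.rfft(signal)
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    mask = (freqs >= low) & (freqs <= high)
    return np.fft.irfft(spectrum * mask, n=len(signal))

# Synthetic EEG-like trace: slow drift + in-band activity + 50 Hz line noise
fs = 250                                       # sampling rate in Hz (assumed)
t = np.arange(0, 2.0, 1.0 / fs)
raw = 0.5 * np.sin(2 * np.pi * 0.1 * t)        # drift, below the pass band
raw += 1.0 * np.sin(2 * np.pi * 5.0 * t)       # activity inside the pass band
raw += 0.8 * np.sin(2 * np.pi * 50.0 * t)      # line-noise artifact

clean = bandpass_fft(raw, fs)                  # 50 Hz component is removed
```

In practice an IIR filter (e.g. a Butterworth band-pass) is more common for streaming EEG; the FFT mask above is chosen only because it makes the keep/reject decision per frequency bin explicit.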

Blockchain auditing convergence in cloud environment

Cloud computing has emerged as a mainstream framework of the utility computing model, providing a reliable infrastructure for data storage as an on-demand, anytime-anywhere service. Outsourcing data to cloud servers thus gives users a simple, economical, and flexible way to handle their data. However, under the outsourcing strategy users no longer have direct control over their data, which raises serious security threats to data integrity. Many public auditing models have been proposed that let users delegate integrity verification of outsourced data to a trusted third-party auditor (TPA). However, these models are susceptible to procrastinating auditors, who might not conduct the scheduled auditing on time. Moreover, many public auditing models are built on a Public Key Infrastructure (PKI) and therefore suffer from the burden of certificate management. To overcome these issues, we construct a certificateless public auditing model that resists both malicious and procrastinating auditors by using blockchain. The basic idea is to have the auditors record every auditing outcome in the blockchain network as a transaction.
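The core property being relied on above — that an auditor cannot quietly alter or back-date a recorded outcome — can be sketched with a hash-chained append-only log. This is a simplified stand-in for a real blockchain (no consensus, no signatures), and all class and field names are hypothetical:

```python
import hashlib
import json
import time

def _block_hash(block):
    """SHA-256 over the canonical JSON form of a block."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

class AuditChain:
    """Append-only chain of auditing outcomes (illustrative sketch)."""

    def __init__(self):
        genesis = {"index": 0, "prev_hash": "0" * 64,
                   "outcome": "genesis", "timestamp": 0}
        self.blocks = [genesis]

    def record_outcome(self, outcome):
        """Append one auditing result. Because each block commits to the
        hash of its predecessor, later tampering or back-dating by a
        procrastinating auditor becomes detectable."""
        block = {"index": len(self.blocks),
                 "prev_hash": _block_hash(self.blocks[-1]),
                 "outcome": outcome,
                 "timestamp": time.time()}
        self.blocks.append(block)
        return block

    def verify(self):
        """Recompute every link; any modification breaks the chain."""
        return all(self.blocks[i]["prev_hash"] == _block_hash(self.blocks[i - 1])
                   for i in range(1, len(self.blocks)))

chain = AuditChain()
chain.record_outcome({"file": "data-001", "integrity": "ok"})
chain.record_outcome({"file": "data-002", "integrity": "ok"})
assert chain.verify()

# Rewriting an already-recorded outcome invalidates every later link
chain.blocks[1]["outcome"]["integrity"] = "forged"
assert not chain.verify()
```

A real deployment would additionally sign each transaction and anchor it in a public chain so that the timestamp itself is beyond the auditor's control.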

Machine Learning and Big Data Analytics

Machine learning is the science of creating algorithms and programs that learn on their own; once designed, they do not need a human to improve. Common applications of machine learning include web search, spam filters, recommender systems, ad placement, credit scoring, fraud detection, stock trading, computer vision, and drug design. An easy way to understand it is this: it is humanly impossible to create models for every possible search query or spam message, so the machine is made intelligent enough to learn by itself. When the latter part of data mining is automated, it is known as machine learning: machines learn to perform tasks that they are not specifically programmed to do. Many techniques are put into practice, such as supervised classification, clustering, regression, and naive Bayes.
Big data refers to a processing approach used when traditional data mining and handling techniques cannot uncover the insights and meaning of the underlying data. Data that is unstructured, time sensitive, or simply very large cannot be processed by relational database engines. This type of data requires a different approach, called big data processing, which uses massive parallelism on readily available hardware.
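The spam-filter application and the naive Bayes technique mentioned above can be tied together in a short sketch. This is a textbook multinomial naive Bayes with Laplace smoothing, written from scratch for illustration; the tiny training corpus is made up.

```python
import math
from collections import Counter, defaultdict

class NaiveBayesText:
    """Multinomial naive Bayes text classifier with add-one smoothing."""

    def fit(self, docs, labels):
        self.word_counts = defaultdict(Counter)   # label -> word frequencies
        self.label_counts = Counter(labels)       # label -> number of docs
        vocab = set()
        for doc, label in zip(docs, labels):
            words = doc.lower().split()
            self.word_counts[label].update(words)
            vocab.update(words)
        self.vocab_size = len(vocab)
        return self

    def predict(self, doc):
        total_docs = sum(self.label_counts.values())
        scores = {}
        for label, n_docs in self.label_counts.items():
            score = math.log(n_docs / total_docs)          # log prior
            counts = self.word_counts[label]
            total_words = sum(counts.values())
            for word in doc.lower().split():
                # Laplace (add-one) smoothing avoids zero probabilities
                # for words never seen with this label during training
                p = (counts[word] + 1) / (total_words + self.vocab_size)
                score += math.log(p)
            scores[label] = score
        return max(scores, key=scores.get)

# Toy spam-vs-ham corpus (invented for the example)
docs = ["win a free prize now", "free money win big",
        "meeting agenda for monday", "project status report"]
labels = ["spam", "spam", "ham", "ham"]
model = NaiveBayesText().fit(docs, labels)
```

On data at the "big data" scale discussed above, the same counting step is what gets distributed: word counts per label are an embarrassingly parallel aggregation, which is why naive Bayes maps naturally onto MapReduce-style frameworks.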

Distributed Databases: NoSQL Approaches

A distributed database (DDB) is a collection of multiple, logically interrelated databases distributed over a computer network. A distributed database management system (D–DBMS) is the software that manages the DDB and provides an access mechanism that makes this distribution transparent to the users. In short: distributed database system (DDBS) = DDB + D–DBMS.

Cloud Data Management

Cloud computing, often referred to as simply “the cloud,” is the delivery of on-demand computing resources — everything from applications to data centers — over the internet on a pay-for-use basis. Cloud-based applications — or software as a service (SaaS) — run on distant computers “in the cloud” that are owned and operated by others and that connect to users’ computers via the internet and, usually, a web browser. Platform as a service (PaaS) provides a cloud-based environment with everything required to support the complete lifecycle of building and delivering web-based (cloud) applications — without the cost and complexity of buying and managing the underlying hardware, software, provisioning, and hosting. Infrastructure as a service (IaaS) provides companies with computing resources including servers, networking, storage, and data center space on a pay-per-use basis.

CURRENT RESEARCH AREAS

Blockchain & Distributed Computing

Brain–Computer Interfaces

Virtualization and Scheduling in Cloud Environment

Big Data Mining

Validation of ML strategies for Big Data

Distributed Databases

Cloud Databases

Modelling Big Data

Processing Big Data

Load balancing approaches

Deep learning strategies in Cloud Environment

Community Detection in Large-Scale Big Data Networks