Computational Medicine (CM) is an emerging discipline that develops quantitative approaches to understanding the mechanisms, diagnosis and treatment of disease through applications of mathematics, engineering and computer science. Thanks to computational medicine, a California startup named Carmenta developed a blood test that detects proteins associated with preeclampsia during pregnancy, providing a level of diagnostic accuracy not previously available; the company brought the product to market in about two years. IBM's computational biology centre researches making sense of the data explosion in genomics, advanced precision oncology, cardiac modelling, neuroimaging, neuroanalysis and brain modelling, simulations of biomolecular systems, and protein science.
The UNC School of Medicine works in the fields of cognitive computing and machine learning, image analysis and computer vision, computational and mathematical modelling, bioinformatics and computational genomics, and health informatics and network analysis. These technical areas underpin computational biomedicine and bioinformatics.
According to the University of Michigan Medical School, the field covers research in the following areas:
- Genomics, regulatory genomics and epigenomics
Research in these areas spans from basic science to clinical applications, as well as the development of tools and methods that advance the field. Researchers develop computational approaches for mining multiple types of genomics data and for understanding the impact of genomic variation in context.
- Protein structure (the building blocks of life), proteomics, and alternative splicing
Researchers use experimental approaches to measure, and simulations and mathematical modelling to predict, protein interactions in regulatory networks. Applying this understanding of protein structure and function to clinical research, they use informatics tools, simulation and modelling to facilitate drug discovery and design. Researchers also develop computational methods to process and extract biological information from complex proteomic datasets.
- Multi-“omics” integrative bioinformatics
Researchers develop bioinformatics tools that enable multi-omics data integration and support meaningful interpretation. They build tools and methods for the analysis and integration of genomic, transcriptomic, metabolomic and epigenomic data.
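The core of the integration step described above can be sketched in a few lines. This is a toy illustration, assuming each omics layer is represented as a per-gene dictionary of measurements; the gene names and values are hypothetical, not taken from any real dataset:

```python
# Minimal sketch of multi-omics integration: join measurements from
# different omics layers on a shared gene identifier.
# Gene names and values below are hypothetical examples.

expression = {"TP53": 8.2, "BRCA1": 5.1, "EGFR": 9.7}     # transcriptomics (log2 counts)
methylation = {"TP53": 0.12, "BRCA1": 0.85, "MYC": 0.40}  # epigenomics (beta values)

def integrate(*layers):
    """Keep only genes measured in every omics layer,
    mapping each gene to its tuple of per-layer values."""
    shared = set.intersection(*(set(layer) for layer in layers))
    return {gene: tuple(layer[gene] for layer in layers) for gene in sorted(shared)}

combined = integrate(expression, methylation)
# combined maps each shared gene to (expression, methylation)
```

Real pipelines must additionally reconcile identifier systems (gene symbols vs. Ensembl IDs), normalise across platforms, and handle missing data; the intersection-join above shows only the structural idea.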
- Systems biology and networks analysis
By combining experimental biology with computational modelling, simulation and bioinformatics, systems biology aims to understand how a biological system (a cell, an organ, an organism, or a population of organisms) functions. As a component of systems biology, network analysis applies theories developed for computer networks, social networks and physical networks to biological systems in order to generate predictive models of their behaviour.
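A minimal sketch of the network-analysis idea, using a tiny hypothetical protein-interaction graph (the edges are illustrative, not curated interaction data): represent the network as an adjacency structure and rank proteins by degree, a simple proxy for identifying "hub" proteins in the system.

```python
# Sketch of network analysis on a hypothetical protein-interaction graph:
# build an adjacency map from an edge list and rank proteins by degree.
from collections import defaultdict

# Illustrative edges only; real networks come from interaction databases.
edges = [("TP53", "MDM2"), ("TP53", "BRCA1"), ("BRCA1", "RAD51"), ("TP53", "ATM")]

adjacency = defaultdict(set)
for a, b in edges:
    adjacency[a].add(b)
    adjacency[b].add(a)

degree = {protein: len(neighbours) for protein, neighbours in adjacency.items()}
hubs = sorted(degree, key=degree.get, reverse=True)
# hubs[0] is the most connected protein in this toy network
```

In practice, researchers use richer measures (betweenness, clustering, module detection) and graph libraries, but degree ranking already captures the basic move: translating a biological system into a graph whose structure can be analysed.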
- Biomedical data science, translational bioinformatics, and pharmacogenomics
Researchers create and use bioinformatics pipelines and machine learning methods, coupled with imaging techniques, to identify regulatory variants and gain insight into genetic and epigenetic mechanisms. They develop models of dynamic genetic networks during disease processes and use computational modelling to understand gene splicing (Guan). They also design sensors to collect physiological signals and images, and analyse such data for clinical decision support.
- Methodological development in computational biology
In genomics, methodological development includes large-scale association analysis, meta-analysis and imputation.
Researchers analyse complex regions of the genome that are not easily resolved through modern sequencing approaches, develop algorithms for integrating multiple types of genomic data, and apply pattern recognition to the analysis of evolutionary history in genomic data. This research area also includes developing computational methods for processing and analyzing complex proteomic datasets.
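At its simplest, the association analysis mentioned above compares allele counts between cases and controls for a single variant. A minimal sketch, using a 2x2 chi-square statistic on hypothetical counts (real genome-wide studies add covariates, multiple-testing correction and p-value computation):

```python
# Toy single-variant association test: chi-square statistic for a
# 2x2 table of allele counts in cases vs. controls (counts are made up).

def chi_square_2x2(a, b, c, d):
    """Chi-square statistic for the table [[a, b], [c, d]]:
    rows = cases/controls, columns = risk allele / other allele."""
    n = a + b + c + d
    expected = [
        (a + b) * (a + c) / n, (a + b) * (b + d) / n,
        (c + d) * (a + c) / n, (c + d) * (b + d) / n,
    ]
    observed = [a, b, c, d]
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

# Hypothetical counts: 60/40 risk alleles in cases vs. 40/60 in controls.
stat = chi_square_2x2(60, 40, 40, 60)
# A larger statistic indicates a stronger case/control allele-frequency difference.
```

For these counts the statistic is 8.0 (every expected cell is 50), which would be compared against the chi-square distribution with one degree of freedom to obtain a p-value.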
- Applications to complex genetic diseases (psychiatric disorders, cancer, etc.)
Researchers test theories and methods in bioinformatics and computational biology in the study of these diseases and directly contribute to their diagnoses and treatments.
Supercomputers are highly powerful computers that can perform billions of operations per second. The Barcelona Supercomputing Centre (BSC), for example, houses the most important computing cluster on the Iberian Peninsula and applies it to problems in computational biomedicine. Typically, supercomputers are used to solve problems so complex that they exceed the calculating capacity of a conventional computer.

Computational biology helps in drug development, for example by using mathematical models to predict the activity of molecules and to design molecules with optimal properties. Researchers also use it to understand how proteins work and to simulate the behaviour of a cell from its basic elements. In 2013, the Nobel Prize in Chemistry was awarded for the development of software models used to understand complex chemical processes. We could view life as a highly complicated chemical reactor in which everything happens according to a complex stochastic regime, hence the difficulty we face in understanding it. Computer simulations allow us to control conditions that we define precisely and then watch what happens in silico, that is, in a computer simulation, in order to predict what happens in vivo.
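The "complex stochastic regime" of cellular chemistry is often explored in silico with stochastic simulation, the classic version being the Gillespie algorithm. A minimal sketch for a birth-death process, such as production and degradation of a single protein species; the rate constants are illustrative, not drawn from any real system:

```python
# Minimal Gillespie (stochastic simulation) sketch of a birth-death process:
# a protein is produced at constant rate k_make and each copy degrades
# at rate k_decay. Rate constants are illustrative examples only.
import random

def gillespie(k_make=10.0, k_decay=0.1, n0=0, t_end=100.0, seed=1):
    random.seed(seed)
    t, n = 0.0, n0
    while t < t_end:
        rates = [k_make, k_decay * n]           # propensities: birth, death
        total = sum(rates)
        if total == 0:
            break
        t += random.expovariate(total)          # exponential waiting time
        if random.random() < rates[0] / total:  # choose which reaction fires
            n += 1
        else:
            n -= 1
    return n

# At steady state the mean copy number approaches k_make / k_decay = 100,
# but any single run fluctuates around that value.
final_count = gillespie()
```

Production runs of this kind of simulation, scaled up to thousands of coupled reactions, are exactly the workloads that motivate the supercomputing clusters described above.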
Research in the field of biomedicine with the help of computers is saving lives and will save many more in the future.
– Rudra Prakash Sarkar