
Ranga Vatsavai

Professor

Engineering Building II (EB2) 2254

Grants

Date: 10/01/21 - 9/30/26
Amount: $15,147,874.00
Funding Agencies: National Science Foundation (NSF)

The Science and Technologies for Phosphorus Sustainability (STEPS) Center is a convergence research hub for addressing the fundamental challenges associated with phosphorus sustainability. The vision of STEPS is to develop new scientific and technological solutions for regulating, recovering, and reusing phosphorus that can readily be adopted by society, through fundamental research conducted by a broad, highly interdisciplinary team. Key outcomes include new atomic-level knowledge of phosphorus interactions with engineered and natural materials, new understanding of phosphorus mobility at industrial, farm, and landscape scales, and prioritization of best management practices and strategies drawn from diverse stakeholder perspectives. Ultimately, STEPS will provide new scientific understanding, enabling technologies, and transformative improvements in phosphorus sustainability.

Date: 08/01/22 - 1/31/25
Amount: $1,000,000.00
Funding Agencies: National Science Foundation (NSF)

Plant disease outbreaks are increasing and threaten food security for the vulnerable in many areas of the world and in the US. Climate change is exacerbating weather events that affect crop production and food access for vulnerable areas. Now a global human pandemic is threatening the health of millions on our planet. A stable, nutritious food supply will be needed to lift people out of poverty and improve health outcomes. Plant diseases, both endemic and recently emerging, are spreading, exacerbated by climate change, transmission through global food trade networks, pathogen spillover, and the evolution of new pathogen genetic lineages. Prediction of plant disease pandemics is unreliable due to the lack of real-time detection, surveillance, and data analytics to inform decisions and prevent spread. To tackle these grand challenges, a new set of predictive tools is needed. In the PIPP Phase I project, our multidisciplinary team will develop a pandemic prediction system called the "Plant Aid Database (PAdb)" that links pathogen transmission biology, disease detection by in-situ and remote sensing, genomics of emerging pathogen strains, and real-time spatial and temporal data analytics and predictive simulations to prevent pandemics. We plan to validate the PAdb using several model pathogens, including novel and host-resistance-breaking lineages of two Phytophthora species, Phytophthora infestans and P. ramorum, and the cucurbit downy mildew pathogen Pseudoperonospora cubensis. Adoption of new technologies and mitigation interventions to stop pandemics requires acceptance by society. In our work, we will also characterize how human attitudes and social behavior affect disease transmission and the adoption of surveillance and sensor technologies by engaging a broad group of stakeholders, including growers, extension specialists, the USDA APHIS, the Department of Homeland Security, and the National Plant Diagnostic Network, in a Biosecurity Preparedness workshop. This convergence science team will develop tools that help mitigate future plant disease pandemics using predictive intelligence. The tools and data can help stakeholders prevent spread from initial source populations before pandemics occur and are broadly applicable to animal and human pandemic research.

Date: 05/10/21 - 6/06/24
Amount: $370,068.00
Funding Agencies: Intelligence Advanced Research Projects Activity (IARPA)

Develop novel approaches for petabyte-scale remote sensing image data management and analysis. Develop a spatiotemporal indexing scheme, a spatiotemporal datacube system, and associated components such as interpolation, reprojection, and caching. Implement parallel and distributed algorithms to scale datacube operations.
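
As a rough illustration of the coordinate-based spatiotemporal indexing and interpolation components described above, the following minimal Python sketch builds a toy (time, y, x) datacube with xarray; the grid, variable names, and values are invented for illustration and are not the project's actual datacube system (xarray's interp additionally requires scipy).

import numpy as np
import xarray as xr

# Hypothetical toy datacube: (time, y, x) reflectance values on a small grid.
times = np.array(["2021-06-01", "2021-06-11", "2021-06-21"], dtype="datetime64[D]")
ys = np.linspace(35.0, 36.0, 64)    # latitude-like coordinate
xs = np.linspace(-79.0, -78.0, 64)  # longitude-like coordinate
cube = xr.DataArray(
    np.random.rand(len(times), len(ys), len(xs)).astype("float32"),
    coords={"time": times, "y": ys, "x": xs},
    dims=("time", "y", "x"),
    name="reflectance",
)

# Spatiotemporal indexing: select a window in space and time by coordinate value.
window = cube.sel(time=slice("2021-06-01", "2021-06-15"),
                  y=slice(35.2, 35.4), x=slice(-78.9, -78.7))

# Interpolation onto a finer grid, one of the associated datacube components.
finer = cube.interp(y=np.linspace(35.0, 36.0, 256), x=np.linspace(-79.0, -78.0, 256))

# At scale, the cube could be chunked (e.g., cube.chunk({"time": 1})) so that
# dask executes these operations in parallel across many workers.
print(window.shape, finer.shape)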

Date: 07/01/19 - 12/31/20
Amount: $60,000.00
Funding Agencies: Center for Accelerated Real Time Analytics (CARTA) - NCSU Research Site

In many real-world applications, data loses its value if it is not analyzed in near real time. Examples include natural disasters, crop disease identification and bioterrorism, traffic monitoring, and monitoring human activities in public places. Edge computing refers to pushing computing power to the edge of the network, bringing it closer to the sensors. We envision that embedded supercomputers (e.g., Jetson TX1 and TX2; 1 teraflop; ~10 watts) enable computing at the edge (e.g., on UAVs). This framework would allow near real-time analytics on streaming data, which is critical for first responders and national security agencies alike, and would compress/reduce data before it is transmitted to the cloud or data centers. In this project, we propose to develop novel machine learning algorithms that run on embedded supercomputers while the data is still in device memory, and to demonstrate the technology in two real-world applications: crop monitoring and traffic monitoring. The proposed technical work involves the following three key stages: (i) generate a statistical model from historical data (e.g., spectral signatures of different crops) using a statistically principled mixture model (e.g., a Gaussian Mixture Model (GMM)); (ii) as data is acquired, compare new (streaming) data against the GMM to identify any anomalous patterns (e.g., weeds); and (iii) generate an event signal about the anomaly before the data is compressed and transferred out of device memory.
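
A minimal sketch of the three-stage workflow above, assuming scikit-learn's GaussianMixture and synthetic spectral data; the band count, class means, and anomaly threshold are illustrative assumptions, not the project's implementation.

import numpy as np
from sklearn.mixture import GaussianMixture

# Stage (i): fit a GMM to historical per-pixel spectral signatures.
# Toy data: 5 spectral bands, three crop classes; real inputs would come from archived imagery.
rng = np.random.default_rng(0)
historical = np.vstack([rng.normal(loc=m, scale=0.05, size=(500, 5))
                        for m in (0.2, 0.5, 0.8)])
gmm = GaussianMixture(n_components=3, covariance_type="full", random_state=0).fit(historical)

# Threshold chosen from the training data itself (e.g., 1st percentile of log-likelihood).
threshold = np.percentile(gmm.score_samples(historical), 1)

# Stage (ii): score each incoming (streaming) batch while it is still in device memory.
def flag_anomalies(batch):
    log_lik = gmm.score_samples(batch)
    return log_lik < threshold          # True where a pixel does not fit any crop mode

# Stage (iii): emit an event signal before the batch is compressed and shipped off-device.
streaming_batch = rng.normal(loc=0.5, scale=0.3, size=(100, 5))
if flag_anomalies(streaming_batch).any():
    print("anomaly event: possible weeds detected in current frame")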

Date: 07/31/14 - 9/30/20
Amount: $24,611,102.00
Funding Agencies: US Dept. of Energy (DOE)

NC State University, in partnership with the University of Michigan, Purdue University, the University of Illinois at Urbana-Champaign, Kansas State University, Georgia Institute of Technology, NC A&T State University, Los Alamos National Laboratory, Oak Ridge National Laboratory, and Pacific Northwest National Laboratory, proposes to establish a Consortium for Nonproliferation Enabling Capabilities (CNEC). The vision of CNEC is to be a pre-eminent research and education hub dedicated to the development of enabling technologies and technical talent for meeting the grand challenges of nuclear nonproliferation in the next decade. CNEC research activities are divided into four thrust areas: 1) Signatures and Observables (S&O); 2) Simulation, Analysis, and Modeling (SAM); 3) Multi-source Data Fusion and Analytic Techniques (DFAT); and 4) Replacements for Potentially Dangerous Industrial and Medical Radiological Sources (RDRS). The goals are: 1) identify and directly exploit signatures and observables (S&O) associated with special nuclear material (SNM) production, storage, and movement; 2) develop simulation, analysis, and modeling (SAM) methods to identify and characterize SNM and facilities processing SNM; 3) apply multi-source data fusion and analytic techniques to detect nuclear proliferation activities; and 4) develop viable replacements for potentially dangerous existing industrial and medical radiological sources. In addition to research and development activities, CNEC will implement educational activities with the goal of developing a pool of future nuclear nonproliferation and other nuclear security professionals and researchers.

Date: 09/16/16 - 12/31/19
Amount: $97,260.00
Funding Agencies: US National Park Service

The purpose of the updated and extended project is to provide additional technical support, consultation and research related to developing and evaluating new methods in the application of geospatial analysis for the Rivers, Trails, and Conservation Assistance (RTCA) program within the Conservation and Outdoor Recreation (COR) Branch of the National Park Service (NPS). The additional technical support, consultation and research activities will include, but are not limited to: 1) developing training material for using the RTCA web mapping application; 2) incorporating additional COR Branch program data into the existing RTCA Enterprise database; and 3) enhancing the current RTCA web mapping application by incorporating existing themed GIS web services.

Date: 02/01/18 - 8/31/19
Amount: $50,000.00
Funding Agencies: Lenovo

Massive amounts of remote sensing data are being collected and archived from satellites and airborne platforms (including drones) on a daily basis. These data support a wide range of applications of national importance, including crop type mapping, forest mapping, urban neighborhood mapping, mapping damage due to flooding, hailstorms, and forest fires, assessing impacts of climate change on crops, unusual crop detection (e.g., poppy plantations), tracking changes in biomass, and understanding the complex interactions between food, energy, and water. Classification of these high-resolution images requires object- and arbitrary-patch-based classification to capture relevant spatial context. The advent of multiple instance learning and deep learning took the natural image processing community by storm. However, their application to satellite images has been slow due to training data and computational requirements. In this project, we develop deep learning algorithms for classification of satellite images and scale these algorithms on Lenovo/Intel's new architectures and software infrastructure (e.g., Neon, Caffe, Theano, and MXNet).
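
To give a concrete flavor of patch-based classification, here is a minimal, self-contained sketch of a small convolutional patch classifier; it is written in PyTorch rather than the frameworks named above, and the band count, patch size, and number of classes are invented for illustration.

import torch
import torch.nn as nn

# Toy patch classifier: 4-band (e.g., R, G, B, NIR) 64x64 image patches -> land-cover class.
class PatchCNN(nn.Module):
    def __init__(self, bands=4, n_classes=6):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(bands, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(64 * 16 * 16, n_classes)

    def forward(self, x):
        return self.head(self.features(x).flatten(1))

model = PatchCNN()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

# One illustrative training step on random tensors standing in for labeled patches.
patches = torch.randn(8, 4, 64, 64)
labels = torch.randint(0, 6, (8,))
optimizer.zero_grad()
loss = loss_fn(model(patches), labels)
loss.backward()
optimizer.step()
print(float(loss))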

Date: 01/01/17 - 8/31/18
Amount: $21,237.00
Funding Agencies: Omidyar Network

Slums have become an inescapable feature of cities in the developing world, and the number of people living in slums has increased rapidly, coming close to 1 billion and rising higher (UN-Habitat 2010). Relatively little is known, however, about patterns of slum development over periods of time and about factors associated with progressive improvements. One of the objectives of this research is to develop a prototype methodology for semi-automatic slum identification and categorization that can speedily and reliably be adapted for use in other cities.

Date: 07/01/17 - 3/31/18
Amount: $45,060.00
Funding Agencies: US Army

Datasets being generated by experiment and simulation today are increasingly large, and nations across the world, including China, the United States, Europe, and Japan, have all invested heavily in developing computers capable of processing or generating these datasets. These datasets come from applications in many areas and are driven by national security issues as well as industries of strategic value. Large-scale computing is seen as driving technological developments in biology and biomedicine, high-energy physics (a key to stockpile stewardship), and materials science. All of these are areas where the United States has traditionally led the world. However, recent developments have placed China as the leader in building large-scale computers, and there is a concern that this could result in the loss of a leadership role for the United States in many of the related technologies. With this in mind, the United States has placed a renewed emphasis on developing exascale computer platforms and, strategically, on the development of algorithms that can make use of large computers to support decisions in science and engineering, an area where the United States arguably still leads the world. In this proposal, we propose adapting Kitware's Catalyst and Cinema platforms to perform new summarization tasks, including compression, scalably on these new architectures. To demonstrate the effectiveness of these summarizations for this Phase I project, we will adapt a simulation program to use Catalyst for in-situ processing, saving only the dynamic summarization, in order to provide stakeholders with the information necessary to make a decision and be confident in the simulation process.
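
The following schematic Python sketch illustrates the general in-situ summarization idea (keeping a compact per-step summary instead of writing the full field every step); it does not use Kitware's Catalyst or Cinema APIs, and the field, solver step, and summary statistics are stand-ins chosen only for illustration.

import numpy as np

# In-situ summarization: instead of persisting the full field at each timestep,
# keep only a few statistics and a coarse histogram per step.
def summarize(field, bins=16):
    hist, edges = np.histogram(field, bins=bins)
    return {"min": float(field.min()), "max": float(field.max()),
            "mean": float(field.mean()), "hist": hist.tolist(), "edges": edges.tolist()}

summaries = []
state = np.random.rand(256, 256)          # stand-in for a simulation field
for step in range(100):
    state = 0.99 * state + 0.01 * np.random.rand(256, 256)   # stand-in for one solver step
    summaries.append(summarize(state))    # O(bins) storage per step instead of O(256*256)

print(len(summaries), "summaries kept; last mean =", summaries[-1]["mean"])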

Date: 09/06/17 - 9/30/17
Amount: $14,990.00
Funding Agencies: US Dept. of Energy (DOE)

Scaling up scientific data analysis and machine learning algorithms for data-driven discovery is a challenging task. Despite the growing need for analysis from science domains that are generating "Big Data" from instruments and simulations, building high-performance analytical workflows of data-specific algorithms has been impeded by: (i) the evolving nature of the "Big Data" hardware and software architecture landscape, (ii) new programming models imposed by newer architectures, and (iii) a lack of understanding of the data-parallel kernels of analysis algorithms and their performance on different architectures and programming environments. NCSU will conduct research on benchmarking core graph kernels and computing primitives.
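
As a hypothetical illustration of what benchmarking a single core graph kernel might look like, the sketch below times a breadth-first search on a random sparse graph using SciPy; the graph size and kernel choice are assumptions made only for this example.

import time
import numpy as np
import scipy.sparse as sp
from scipy.sparse.csgraph import breadth_first_order

# Build a random directed sparse graph with ~avg_deg edges per vertex.
n, avg_deg = 100_000, 8
rng = np.random.default_rng(0)
rows = rng.integers(0, n, n * avg_deg)
cols = rng.integers(0, n, n * avg_deg)
graph = sp.csr_matrix((np.ones(len(rows)), (rows, cols)), shape=(n, n))

# Time one BFS traversal from vertex 0 as a micro-benchmark of the kernel.
t0 = time.perf_counter()
order, _ = breadth_first_order(graph, i_start=0, directed=True)
elapsed = time.perf_counter() - t0
print(f"BFS visited {len(order)} vertices in {elapsed:.3f} s "
      f"({len(order) / elapsed:,.0f} vertices/s)")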

