Ayush Parashar, Co-founder and Vice President of Engineering for Unifi Software
The insideBIGDATA 2019 Executive Round Up features insights from industry executives with lengthy experience in the big data industry. Here’s a look at the insights from Ayush Parashar, Co-founder and Vice President of Engineering for Unifi Software.
Ayush has deep software engineering expertise around big data solutions and strong domain knowledge of Hadoop, MPP databases and systems, performance engineering, and data integration. Before Unifi, he was part of the founding engineering team at Greenplum.
The full text of Ayush Parashar’s insights from our Executive Round Up is provided below.
Daniel D. Gutierrez, Managing Editor & Resident Data Scientist – insideBIGDATA.com
insideBIGDATA: AI and its adaptability come with a significant barrier to its deployment particularly in regulated industries like drug discovery – “explainability” as to how AI reached a decision and gave its predictions. How will 2019 mark a new era in coming up with solutions to this very real problem?
Ayush Parashar: For AI-generated answers to be broadly trusted and adopted, the AI decision-making process must be transparent and presented in a human-friendly fashion. In other words, we’ll need to visibly show how AI algorithms arrived at a conclusion in a way that an expert in the relevant area can easily understand.
We see that level of transparency emerging in data analytics already, where we can see data lineage and a full audit trail from the origin of a data source through every manipulation of that data as it’s joined with other data and served up as an insight. In 2019 and beyond, I expect we’ll see even more transparency in the AI powering analytics, and in more forms – including the broader use of visualizations to instantly show how a decision was reached when making diagnoses for patients in the healthcare industry. As companies embrace a culture of self-service data use, transparency around data’s origin to determine its trustworthiness will play an even greater role in ‘explainability’.
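The lineage and audit trail described above can be illustrated with a minimal sketch in Python. The `TrackedDataset` class and its field names are hypothetical, not any product’s API; the point is only that each transformation appends to an ordered history, so the full path from origin to insight can be displayed.

```python
# Minimal lineage-tracking sketch (hypothetical helper, not a real library API):
# every transformation on a dataset is recorded so the full audit trail from
# the data's origin to the final insight can be shown to a human reviewer.

class TrackedDataset:
    def __init__(self, name, rows, history=None):
        self.name = name
        self.rows = rows
        # Lineage: ordered list of (operation, detail) entries, starting at load.
        self.history = list(history or [(f"load:{name}", "origin")])

    def filter(self, predicate, detail):
        kept = [r for r in self.rows if predicate(r)]
        return TrackedDataset(self.name, kept, self.history + [("filter", detail)])

    def join(self, other, key):
        index = {r[key]: r for r in other.rows}
        joined = [{**r, **index[r[key]]} for r in self.rows if r[key] in index]
        return TrackedDataset(f"{self.name}+{other.name}", joined,
                              self.history + other.history + [("join", key)])

    def lineage(self):
        # Human-readable audit trail, in the order operations were applied.
        return [f"{op}({detail})" for op, detail in self.history]


# Illustrative healthcare-flavored data (made up for the sketch).
patients = TrackedDataset("patients", [{"id": 1, "age": 70}, {"id": 2, "age": 35}])
labs = TrackedDataset("labs", [{"id": 1, "risk": "high"}])
result = patients.filter(lambda r: r["age"] > 60, "age>60").join(labs, "id")
print(result.lineage())
```

Here the final insight (one high-risk patient over 60) carries its complete provenance: loaded from `patients`, filtered on age, joined with `labs`.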
insideBIGDATA: What industries do you feel will make the best competitive use of AI, machine learning, and deep learning in the next year? Pick one industry and describe how it will benefit from embracing or extending its embrace of these technologies.
Ayush Parashar: In the last couple of years, AI made it big in the lifestyle industry, with the likes of Alexa and Siri making life easy for consumers. There have also been huge advancements and refinements in transportation, with the growing use of AI to improve autonomous driving. However, I believe healthcare is where we’ll see the greatest competitive, and certainly beneficial, use: the ability to aid decision making for patient diagnosis and care.
Doctors performing robotic surgeries, for example, will have an enhanced experience because of AI recommendations prompting them during procedures. Then there are areas in drug discovery where AI combined with big data can help inform critical findings. Finally, there are areas where AI, ML and deep learning together may help provide a quicker and better public health response to cancer by guiding the best biomedical informatics, information and communication technology available.
insideBIGDATA: As deep learning pushes businesses to innovate and improve their AI and machine learning offerings, more specialized tooling and infrastructure will need to be hosted on the cloud. What’s your view of enterprises seeking to improve their technological infrastructure and cloud hosting processes to support their AI, machine learning, and deep learning efforts?
Ayush Parashar: Cloud has been a big factor in helping businesses innovate. To compete, especially around AI right now, it’s critical for a company to iterate on the latest and greatest tooling that can scale up and down instantly in a commodity environment. All the major cloud providers have already innovated on their cloud offerings around AI, and that has made AI algorithms available in a ubiquitous fashion.
In addition to tooling and infrastructure around AI, data management and analytic tools are very important to take AI to the next level. Getting the right data and integrating data from various places, cleansing it and preparing it is pivotal, and it’s often the first step that a data scientist works on for their AI project.
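That first step – getting the right data, integrating it from various places, and cleansing it – can be sketched in a few lines of Python. The source names (`crm`, `billing`), field names, and the freshest-wins merge policy are illustrative assumptions, not a description of any specific tool.

```python
# Sketch of the data prep "first step": integrate records from two hypothetical
# sources, cleanse them, and prepare one tidy table for a data scientist.
# All source/field names and the merge policy are illustrative assumptions.

def cleanse(record):
    # Normalize key names to lowercase, strip stray whitespace from strings,
    # and drop records that are missing an id entirely.
    cleaned = {k.strip().lower(): (v.strip() if isinstance(v, str) else v)
               for k, v in record.items()}
    return cleaned if cleaned.get("id") is not None else None

def integrate(*sources):
    # Merge records from multiple sources, de-duplicating on id.
    # Later sources win on conflicting fields (a freshest-wins policy).
    merged = {}
    for source in sources:
        for record in source:
            cleaned = cleanse(record)
            if cleaned:
                merged[cleaned["id"]] = {**merged.get(cleaned["id"], {}), **cleaned}
    return sorted(merged.values(), key=lambda r: r["id"])


crm = [{"ID": 2, "Name": " Bea "}, {"ID": None, "Name": "orphan"}]
billing = [{"id": 1, "name": "Avi"}, {"id": 2, "plan": "pro"}]
print(integrate(crm, billing))
```

The output is a single de-duplicated table with consistent field names – the kind of prepared input an AI project typically starts from.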
On-premises infrastructure doesn’t always provide this agility. Organizational teams can become more successful if they can do more in less time. Cloud allows more tools to come together, faster, at scale.
insideBIGDATA: How will AI-optimized hardware solve important compute and storage requirements for AI, machine learning, and deep learning?
Ayush Parashar: Compute requirements are very important for any AI, ML and deep learning task. GPUs have made a huge impact on the compute side; however, intelligence is moving toward the edge, and AI with it. Rapid innovation becomes possible when that intelligence is built directly into the chip. As AI and ML move toward edge compute models, specialized hardware will soon disrupt the market and play a big role. It’s easy to envision AI-specialized chips in smartphones and home consumer devices of every form, including refrigerators, ovens and cars. It’s exciting to see innovation around AI chips, including neural network processors, FPGAs and neuromorphic chips. From a storage perspective, the biggest impact has been made by the use of SSDs.
insideBIGDATA: What’s the most important role AI plays for your company’s mission statement? How will you cultivate that role in 2019?
Ayush Parashar: AI is at the heart of our company’s mission statement: to make data discovery and working with data ubiquitous inside an enterprise. It’s central to providing an easy user experience where different personas, like a data analyst or a data engineer, can work with data easily.
In 2019, we look forward to expanding the AI tools that substantially broaden intuitive and seamless ways to interact with data.