Academia and industry at first glance appear to be strange bedfellows. One focuses on the theoretical and conceptual, whilst the other is driven by the practicalities of deadlines, goals and ultimately, profit.
I’m in the privileged position of working on both sides of the fence. I’m Professor of Computer Science at the University of San Francisco as well as Chief Scientist at SnapLogic, a provider of application and data integration software. I have worked on AI in both of these roles, and throughout my 20-year career I’ve come to the realisation that, when it comes to driving innovation, these two distinct spheres need to work together.
AI promises to be the most important technology of the future, and if the lofty ambitions and out-of-the-box thinking of academia can find synergy with the can-do attitude, urgency and resources of industry, we’ll see an explosion in its applications. In fact, I believe that AI and ML technology won’t just be a nice feature but will be a requirement for all applications going forward.
Data, data everywhere
This collaboration between industry and academia has been growing for some time and, like most things in technology, it all boils down to the data. For the first 10 to 15 years of my career I was on the traditional academic track, and whenever we were undertaking research or publishing papers, there was always one consistent stumbling block – we lacked real-world data.
That all changed around 10 years ago with the emergence of social and search. The Googles, Twitters and Facebooks of the world were challenged by data growth problems that exceeded the capacity of conventional database technology, so they built custom solutions to house their vast datasets.
Of course, the large search and social companies didn’t curate all of this user and behavioural data for the distinct purpose of fuelling AI development, but they saw the potential that it held. The tipping point came when the likes of Twitter first invited academics to analyse this data. Suddenly and unexpectedly, a treasure trove of real-world data was made available to academics. All the theoretical machinery that we had been developing in academia could be realised as real recommendations and meaningful predictive applications. Ph.D.s started moving from academia into industry to work with real data, which in turn fuelled further theoretical developments, and so on.
Many of the latest and most promising developments in AI have been forged through this symbiotic relationship, and if it can be strengthened, we’ll see many more breakthroughs in the pipeline.
The positive effects of academia and industry’s relationship aren’t solely limited to datasets. Cultural cross-pollination is, in my opinion, a vital aspect for driving further innovation in the field.
I’ve briefly alluded to this above, but it’s hard to imagine cultures much more different than industry and academia. As someone who was, for the majority of their career, primarily an academic, I’ve experienced first-hand how these two worlds differ, and how, on an individual level, being exposed to the other side of the coin is an enriching experience.
This is perhaps best illustrated with an example. I first joined SnapLogic in 2010, when it was a much smaller company than it is today. This was my first foray into industry, and I was tasked with building a prototype for a machine learning project to be implemented into our data integration platform. The prospect of putting my code in front of my new colleagues, industry veterans, unsettled me.
As someone who had been coding since the age of twelve or thirteen, it wasn’t a lack of skills or experience that caused this reaction, but rather the prospect of doing so in a wildly different environment to the one I was used to. In academia, code is rarely reviewed. There isn’t an audience; it’s just an aspect of the job you complete. In industry it’s much more purpose-driven. There are goals and deadlines and milestones. Your work must deliver value for customers.
In the end everything was fine, and that code I wrote still exists in the platform to this day, but it shed some light on how sheltered we are in academia from the realities and stresses of industry. By spending time in industry I learned how to adapt to a different type of coding environment, and am a more well-rounded professional as a result.
These cross-cultural benefits are something I’m bringing into my classroom to better prepare my students for a career in industry. I’m now teaching some of the realities of building production software, which many academics might forgo. We’re also bringing some of my students to work on-site at SnapLogic, on real-world AI projects, for course credit.
By exposing younger people to industry at the earliest stages of their careers, rather than far later on as I did, we can equip the next wave of computer scientists with not only the curiosity of the academic, but also the practicality and work ethic of industry. This, I hope, will spur the next great developments in AI.
The AI future
I strongly believe that AI will have a huge impact on the world of business in the years to come. It will make companies more efficient, freeing up resources to be invested in other areas to enhance existing products or develop new ones. It will change job roles and allow employees to focus on tasks that rely on human skills, such as emotional insight, rather than being burdened with rote, repetitive work.
However, it’s going to be an incremental process, and it won’t happen overnight. If we’re to have any hope of accelerating the timeframe, we’re going to need academics and industry pulling in the same direction, pooling their talents and resources and working in sync. The building blocks for this relationship are already in place, and both sides stand to benefit.
Dr Greg Benson, Professor of Computer Science at University of San Francisco and Chief Scientist at SnapLogic