Dotscience debuts DevOps for AI platform
DevOps has transformed the way software engineers deliver applications by making it possible to collaborate, test and deliver software continuously. Dotscience, a pioneer in DevOps for machine learning (ML), emerged from stealth to signal the rise of a new paradigm in which ML engineering is just as easy, fast and safe as modern software engineering built on DevOps techniques.
For data science and ML organizations to realize this DevOps-for-ML vision, the right tooling and processes need to be in place, such as run tracking and collaboration; automated, full attribution of AI model deployments (a comprehensive record of every step taken to create a model); and model health tracking throughout the AI lifecycle.
“Artificial intelligence has the potential to reinvent the global economy, but as a discipline it’s the Wild West out there,” said Luke Marsden, founder and CEO of Dotscience.
“We’ve seen destructive levels of chaos and pain in efforts to operationalize AI due to inadequate tooling and ad hoc processes. The lessons learned from DevOps sorely need to be applied to ML.”
Marsden said that as organizations come to understand that AI models need to be trained and updated continuously, the need for a DevOps platform that accelerates that process will become more apparent. Today most AI models are trained over a prolonged period and then deployed within an application environment. Over time, however, either more data becomes available or organizations determine that the machine learning and deep learning algorithms originally used to create the model need to be updated or replaced. Whatever the underlying reason for replacing an AI model, a platform that addresses all aspects of the AI model lifecycle, including testing, reproducibility, accountability, collaboration and continuous delivery, is essential, he said.
To address those requirements, the Dotscience platform enables concurrent collaboration across development and operations teams, version control of the model-building process, real-time tracking of provenance records, discovery and optimization of hyperparameters when training a model, and tracking of workflows across multiple open source tools.
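The article does not show Dotscience's actual API, but the hyperparameter discovery and run tracking it describes can be sketched in plain Python. All names below (`track_run`, `grid_search`, `toy_train`, the S3 path) are hypothetical illustrations, not Dotscience's interface:

```python
import itertools
import json
import time

def track_run(train, data_path, params):
    """Hypothetical helper: run one training job and record its provenance."""
    start = time.time()
    metrics = train(data_path, **params)
    return {
        "data": data_path,            # which data the run used
        "params": params,             # which hyperparameters it used
        "metrics": metrics,           # what results it produced
        "duration_s": round(time.time() - start, 3),
    }

def grid_search(train, data_path, grid):
    """Try every hyperparameter combination, keeping a record of each run."""
    keys = list(grid)
    return [
        track_run(train, data_path, dict(zip(keys, values)))
        for values in itertools.product(*grid.values())
    ]

# Toy training function standing in for a real ML job.
def toy_train(data_path, lr, depth):
    return {"accuracy": round(0.7 + 0.01 * depth - 0.1 * lr, 3)}

runs = grid_search(toy_train, "s3://bucket/train.csv",
                   {"lr": [0.1, 0.01], "depth": [3, 5]})
best = max(runs, key=lambda r: r["metrics"]["accuracy"])
print(json.dumps(best["params"]))
```

Because every combination produces a full run record rather than just a score, the "best" model keeps its provenance: the data path, parameters and metrics that produced it are all in one place.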
Most teams that build AI models have backgrounds in data science rather than application development. As such, Marsden noted, most of them have had little to no exposure to DevOps practices. In the absence of those processes, teams building AI models routinely encounter issues such as having to manage siloed data and technical debt, all of which conspire to extend the time required to build a model, Marsden said. In addition, teams building AI models need to track not only versions but also runs of their code, tying together the input data with the resulting models and their corresponding hyperparameters and metrics, he added.
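A run record of the kind described above can be reduced to a minimal sketch: fingerprint the input data, then store it alongside the code version, hyperparameters and metrics. The helper names and the `git:` version string are illustrative assumptions, not Dotscience's format:

```python
import hashlib
import json

def fingerprint(data: bytes) -> str:
    """Content hash, so a run is tied to the exact input data it consumed."""
    return hashlib.sha256(data).hexdigest()[:12]

def record_run(data: bytes, code_version: str, params: dict, metrics: dict) -> dict:
    """Hypothetical run record: the minimum needed to reproduce a model."""
    return {
        "data_fingerprint": fingerprint(data),
        "code_version": code_version,
        "params": params,
        "metrics": metrics,
    }

run = record_run(b"col1,col2\n1,2\n", "git:abc1234",
                 {"lr": 0.01}, {"rmse": 0.42})
print(json.dumps(run, indent=2))
```

Hashing the data rather than just naming the file matters: if the file at the same path is later overwritten, the fingerprint mismatch reveals that the recorded run can no longer be reproduced from it.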
In the absence of a DevOps platform such as Dotscience, it becomes challenging for organizations to document what changes were made to an AI model and when, Marsden said. That governance issue has become especially acute because organizations are coming under increased regulatory pressure to document how the AI models they employ are built and updated.
Dotscience’s report “The State of Development and Operations of AI Applications,” also published today, identifies the top three challenges with AI workloads as duplicating work (33%), rewriting a model after a team member leaves (27.6%) and justifying its value (27%). Based on a survey of 500 industry professionals, the report also finds that 52% of respondents track attribution manually using tools such as spreadsheets, while 27% don’t track attribution at all but think it is important. In total, 63% of businesses report spending between $500,000 and $10 million on their AI efforts.
As more organizations depend on DevOps processes to build and deploy applications, there’s no doubt that the teams building the AI models embedded in those applications will have to fall in line with DevOps best practices. The challenge now is finding a way to extend those DevOps processes all the way back to the building of the AI models themselves. Only then is the AI model building and deployment process likely to become agile enough to keep pace with the change now happening across digital business processes.