With businesses increasingly embedding AI across their services, JFrog is responding to the need for a central management system that brings AI deliveries in line with an organization’s existing DevOps practices.
Dubbed “ML model management,” the new capabilities are built into the JFrog software supply chain platform to manage an organization’s local and open source ML models and to secure those models throughout the software development lifecycle (SDLC).
“As the creator of Artifactory — the industry’s leading technology for easily storing, managing, and securing binaries — it’s only natural we’re proud to bring another advanced type of binary — ML models — into a unified software supply chain platform to help customers rapidly deliver trusted software at scale,” said Yoav Landman, chief technology officer and co-founder of JFrog.
JFrog also announced a second DevOps capability, Release Lifecycle Management (RLM), along with a suite of new security features for the JFrog platform.
JFrog platform receives DevOps boost
JFrog has added two new DevOps functionalities — Release Lifecycle Management (RLM) and ML model management.
RLM allows organizations to create an immutable “Release bundle” that defines a potential release and its components early in the software development lifecycle. The capability uses anti-tampering systems, compliance checks, and evidence capture to collect data and insights on each release bundle at every stage of the SDLC, according to Landman.
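As a rough illustration of how a release bundle’s components might be enumerated, the fragment below uses JFrog’s generic file-spec convention of pattern-based artifact selection; the repository names, version, and field layout are illustrative assumptions, not confirmed details of the RLM feature:

```json
{
  "files": [
    {
      "pattern": "libs-release-local/myapp/1.4.0/*.jar"
    },
    {
      "pattern": "docker-local/myapp/1.4.0/"
    }
  ]
}
```

A spec along these lines would pin the exact binaries that make up a candidate release early in the SDLC, so the resulting bundle can be signed, checked for compliance, and promoted through environments as a single immutable unit.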