The token used by Microsoft not only granted unintended access to additional storage through its overly wide access scope; it was also misconfigured to allow “full control” permissions instead of read-only, meaning a potential attacker could not just view the private files but delete or overwrite existing files as well.
In Azure, a SAS token is a signed URL granting customizable access to Azure Storage data, with permissions ranging from read-only to full control. It can cover a single file, a container, or an entire storage account, and its creator can set an optional expiration time, including choosing to never expire.
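Because a SAS token is just a signed URL, the service validates it by recomputing a signature rather than looking it up in a database, which is why leaked tokens cannot easily be audited or tracked. The sketch below, using only the Python standard library, shows the general shape of how an account-SAS signature is computed: an HMAC-SHA256 over a newline-joined "string to sign", keyed with the base64-decoded account key. The field order is a simplified rendition of the documented account-SAS format, not an exact drop-in for Azure's implementation, and the account name and key are invented for illustration.

```python
import base64
import hashlib
import hmac
import urllib.parse

def sign_account_sas(account_name: str, account_key_b64: str,
                     permissions: str, services: str, resource_types: str,
                     start: str, expiry: str,
                     version: str = "2021-08-06") -> str:
    """Simplified account-SAS signature: HMAC-SHA256 over a
    newline-joined string-to-sign (IP range and protocol left empty)."""
    string_to_sign = "\n".join([
        account_name,
        permissions,      # e.g. "r" for read-only, "racwdl" for full control
        services,         # e.g. "b" for blob storage
        resource_types,   # e.g. "o" (one object) vs "sco" (whole account)
        start,
        expiry,           # e.g. "2024-01-01T00:00:00Z", or decades away
        "",               # allowed IP range (unset)
        "",               # allowed protocol (unset)
        version,
    ]) + "\n"
    key = base64.b64decode(account_key_b64)
    digest = hmac.new(key, string_to_sign.encode("utf-8"),
                      hashlib.sha256).digest()
    # The signature travels in the URL as the "sig" query parameter.
    return urllib.parse.quote(base64.b64encode(digest).decode("utf-8"))
```

Note that every field, including the permission string and expiry, is folded into the signature, so the server can verify a token it has never seen before; the flip side is that there is no central record of which tokens exist.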
The full-access configuration “is particularly interesting considering the repository’s original purpose: providing AI models for use in training code,” Wiz said. The format of the model data file intended for downloading is ckpt, a format produced by the TensorFlow library. “It’s formatted using Python’s Pickle formatter, which is prone to arbitrary code execution by design. Meaning, an attacker could have (also) injected malicious code into all the AI models in this storage account,” Wiz added.
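Wiz's point about Pickle is worth making concrete. Deserializing a pickle file executes code chosen by whoever produced the file: an object's `__reduce__` method returns a callable and its arguments, and `pickle.loads` invokes that callable during loading. The hypothetical class below uses the harmless `os.getcwd` to demonstrate the mechanism; an attacker tampering with a model checkpoint would substitute something destructive.

```python
import os
import pickle

class EvilCheckpoint:
    """Stand-in for a tampered model checkpoint (hypothetical example)."""
    def __reduce__(self):
        # pickle calls this callable with these args at load time.
        # Here it is a harmless function; in a real attack it would not be.
        return (os.getcwd, ())

payload = pickle.dumps(EvilCheckpoint())
# Merely *loading* the file runs the attacker-chosen function:
result = pickle.loads(payload)
```

This is why the Python documentation warns never to unpickle data from untrusted sources: the writable storage account meant every model file was, in effect, an executable an attacker could have replaced.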
SAS tokens are difficult to manage
The granularity of SAS tokens opens up the risk of granting far too much access. In the Microsoft GitHub case, the token granted full-control permissions over the entire storage account, with no expiration.
Microsoft’s repository used an Account SAS token, one of three SAS token types; the other two, Service SAS and User Delegation SAS, grant access at the service (application) and user level, respectively.
Account SAS tokens are extremely risky in terms of permissions, hygiene, management, and monitoring, Wiz noted. A SAS token can grant high-level access to a storage account either through excessive permissions or through a wide access scope.
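Both failure modes Wiz describes are visible directly in a SAS URL's query string, where `sp` carries the permissions, `srt` the resource-type scope, and `se` the expiry (these are the standard SAS query parameter names). A minimal stdlib sketch of an auditing check, with the thresholds and the example URL invented for illustration:

```python
from urllib.parse import urlparse, parse_qs

def audit_sas_url(sas_url: str) -> list:
    """Flag risky settings in a SAS URL's query parameters."""
    params = {k: v[0] for k, v in parse_qs(urlparse(sas_url).query).items()}
    findings = []
    perms = params.get("sp", "")
    if any(p in perms for p in "wdc"):     # write/delete/create beyond read
        findings.append("permissions '%s' exceed read-only" % perms)
    if params.get("srt", "") == "sco":     # service + container + object
        findings.append("token is scoped to the entire storage account")
    expiry = params.get("se", "")
    if expiry >= "2030":                   # crude far-future-expiry check
        findings.append("expiry %s is effectively 'never'" % expiry)
    return findings

# Hypothetical URL with the same three problems as the leaked token:
url = ("https://example.blob.core.windows.net/?sv=2021-08-06&ss=b"
       "&srt=sco&sp=racwdl&se=2051-10-05T00:00:00Z&sig=REDACTED")
problems = audit_sas_url(url)
```

Because the token itself is the only record of these settings, this kind of after-the-fact inspection of URLs found in code or logs is often the only way to spot an over-privileged token before an attacker does.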