Users are raising concerns about Weights & Biases' new Master Service Agreement, which grants the company broader rights to use customer data, including ML models, for product improvement and AI feature development.
**Update: my questions have been escalated to their teams. I'll share their answers (& hopefully reassurance) here.**

Weights & Biases sent an email yesterday saying their new Master Service Agreement takes effect May 11th. I use & love wandb, but I'm concerned about the changes and wanted to start a discussion. I sent them an email, but I think I'm too small to hear back.

How do you interpret these changes? Do you worry about intellectual property rights? Do you need an enterprise contract for true protection?

Weights & Biases defines Customer Data as "any data, content or material that Customer (including its Authorized Users) inputs into the Software or Service, *including machine learning models and deep learning research projects, and any visualizations, analyses, and other reports generated by the Software or Service.*"

**1. Who owns your research?**

In the prior agreement, Section 8(b) made this clear:

> As between the parties, *Customer owns and retains all right, title and interest in and to the Customer Data.* Except for the rights granted to W&B in Section 4(a), Customer does not by means of this Agreement or otherwise transfer any other rights to W&B.

The new agreement deletes these statements entirely. Customer Data is now added to Section 6(e), meaning the associated rights survive after a subscription is terminated.

**2. How can Weights & Biases use your data?**

In the prior agreement:

> Customer may transfer Customer Data to W&B and W&B may use Customer Data *to provide the Software and Service*. Customer grants W&B a limited right during each Subscription Term to use Customer Data in accordance with this Agreement, the DPA and BAA (as applicable).

In the new agreement:

> Customer may transfer Customer Data to W&B and Customer grants W&B the right to use Customer Data to (i) provide and improve the W&B Assets, *(ii) develop new product offerings*, and *(iii) for the purposes of providing and improving AI Features*. Customer grants W&B a limited right to use Customer Data in accordance with this Agreement, the DPA and BAA (as applicable).

There's now an explicit callout for using Customer Data (models, logs, reports, etc.) to train AI, and no acknowledgement of an opt-out system. The agreement does say "W&B may use Customer Data from free and academic customers for testing and development purposes," but it fails to differentiate treatment for Pro and Enterprise customer data.

The prior agreement is available on the Wayback Machine here: [https://web.archive.org/web/20260227104844/https://wandb.ai/site/terms/](https://web.archive.org/web/20260227104844/https://wandb.ai/site/terms/)
The article argues that the trend of open-weights AI models becoming more restrictive threatens market competition, since these models currently provide essential price discipline and a privacy-preserving alternative to frontier closed-model providers.
Meta is mandating AI-training software on US employees' work laptops that logs keystrokes and mouse movements, prompting internal backlash over privacy despite the company's claims of safeguards.
This issue covers a new open-weights AI leader, AI's growing political influence, using AI to predict illness, and faster reasoning models. Andrew Ng also discusses AI's potential to create new jobs and his personal use of AI agents.