All Issues
Apr 05 - Apr 11, 2026

AI Weekly: Mythos Escapes Its Sandbox, Meta Abandons Open Source

Models & Releases

4 stories

Meta Muse Spark: First Closed Model from Superintelligence Labs

  • Meta's newly formed Superintelligence Labs (MSL) debuted Muse Spark, its first model — and notably, a closed proprietary one.
  • It uses a tenth of the compute of Llama 4 Maverick and powers Meta AI across WhatsApp, Instagram, Facebook, Messenger, and AI glasses.
  • The pivot away from open-source Llama marks a landmark shift in Meta's AI strategy, driven by the $14B deal that brought Alexandr Wang to Meta.

GLM-5.1: 744B Open-Source MoE Tops SWE-Bench Pro

  • Z.ai (formerly Zhipu AI) released GLM-5.1, a post-training upgrade to GLM-5 built on a 744B parameter MoE with 40B active params.
  • It topped SWE-Bench Pro for coding and is designed for long-horizon agentic tasks with improved handling of ambiguous problems.
  • Released under MIT licence with commercial use allowed — from the same team behind GLM-OCR (covered Mar 21).
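The total-versus-active parameter split above is the signature of mixture-of-experts routing: each token is dispatched to only a few experts, so most weights sit idle on any given forward pass. A toy sketch of the idea — all names and sizes here are illustrative, not GLM-5.1's actual architecture:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy mixture-of-experts layer: many experts exist, but each token
# activates only a few, so active params << total params.
N_EXPERTS, TOP_K, D = 8, 2, 16                   # hypothetical sizes
experts = [rng.standard_normal((D, D)) for _ in range(N_EXPERTS)]
router = rng.standard_normal((D, N_EXPERTS))

def moe_forward(x):
    """Route a single token vector x through its top-k experts."""
    logits = x @ router                          # router score per expert
    top = np.argsort(logits)[-TOP_K:]            # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                     # softmax over the chosen experts
    # Weighted sum of only the chosen experts' outputs
    return sum(w * (x @ experts[i]) for w, i in zip(weights, top))

token = rng.standard_normal(D)
out = moe_forward(token)
print(out.shape)                                 # (16,)
print(f"active fraction: {TOP_K / N_EXPERTS:.2f}")   # 0.25
```

At GLM-5.1's reported scale, the same trick is what lets a 744B-parameter model run with only ~40B parameters exercised per token.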

Gemma 4 31B Beats GPT-5.2 on Agentic Benchmark at $0.20/Run

  • Google's Gemma 4 31B achieved 100% survival and +1,144% median ROI on FoodTruck Bench, an agentic business simulation benchmark.
  • It outperformed GPT-5.2 ($4.43/run), Sonnet 4.6 ($7.90), and Gemini 3 Pro ($2.95) — at just $0.20 per run.
  • Only Opus 4.6 scored higher, at $36/run — 180x more expensive — making Gemma 4 31B the best cost-performance open model yet.

People & Business

3 stories

OpenAI Launches $100 ChatGPT Pro Tier with 5x Codex Limits

  • A new mid-range $100/month plan sits between the $20 Plus and $200 Pro tiers, aimed squarely at professional developers.
  • It offers 5x higher Codex usage limits, directly responding to Anthropic's recent subscription restrictions on third-party harnesses.
  • Context: OpenAI acquired OpenClaw in February 2026 and is aggressively pushing Codex as its primary coding agent product.

OpenAI Acquires TBPN Podcast Network

  • OpenAI acquired TBPN, a tech and business podcast network, in a quiet media strategy move announced April 2.
  • The acquisition suggests OpenAI is building owned media infrastructure alongside its product expansion into coding and enterprise.
  • No financial terms disclosed; TBPN will continue operating its existing shows under the OpenAI umbrella.

Policy & Ethics

3 stories

80% of White-Collar Workers Refusing AI Adoption Mandates

  • A Fortune survey finds 80% of white-collar employees are outright refusing corporate mandates to adopt AI tools.
  • Researchers attribute the refusal to FOBO — Fear of Becoming Obsolete — a quiet rebellion that mirrors the 2022 'quiet quitting' wave.
  • The finding complicates enterprise AI ROI narratives and suggests the bottleneck is cultural adoption, not model capability.

OpenAI Publishes Child Safety Blueprint

  • OpenAI released a formal Child Safety Blueprint on April 8, setting out policy commitments across its consumer and API products.
  • The framework covers detection, reporting, and safeguard requirements for AI-generated content involving minors.
  • Timed alongside Glasswing's launch, it reflects growing pressure on frontier labs to publish explicit safety policies.

Products & Hardware

3 stories

PraisonAI: Production Multi-Agent Framework Hits #1 GitHub Trending

  • PraisonAI is a low-code production multi-agent framework supporting 100+ LLM providers with built-in RAG, memory, and guardrails.
  • Agent teams can plan, research, code, and deliver results directly to Telegram, WhatsApp, and Discord around the clock.
  • It hit #1 on GitHub Trending this week, signalling strong developer appetite for production-ready agentic infrastructure.

OpenAI Outlines Next Phase of Enterprise AI

  • OpenAI published its enterprise AI strategy on April 8, focused on expanding agentic and Codex-powered offerings for businesses.
  • The announcement coincides with the new $100 Pro tier and follows the company's February acquisition of OpenClaw.
  • Enterprise is now OpenAI's primary growth vector as consumer ChatGPT growth plateaus at $25B ARR.

Research & Resources

2 stories

LLM Running on a 1998 iMac G3 with 32MB RAM

  • Developer maddiedreese cross-compiled Karpathy's 260K-parameter TinyStories model (1MB) to run on PowerPC Mac OS 8.5.
  • The project needed no OS X and no modern toolchain — just a 1998 iMac G3 with 32MB of RAM and a hand-ported inference engine.
  • A delightful reminder that LLM inference is fundamentally just matrix multiplication — and matrix multiplication is old.
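The closing observation can be made concrete: a minimal language-model forward pass really is just table lookups and matrix multiplies. A sketch with random stand-in weights — this is not Karpathy's actual TinyStories architecture, just the shape of the computation:

```python
import numpy as np

rng = np.random.default_rng(1)

# Tiny "language model": embed -> hidden matmul -> output matmul.
# Weights are random stand-ins; a real model would load a trained checkpoint.
VOCAB, DIM = 32, 8                           # hypothetical sizes
embed = rng.standard_normal((VOCAB, DIM))    # token embedding table
w_hidden = rng.standard_normal((DIM, DIM))
w_out = rng.standard_normal((DIM, VOCAB))

def next_token(token_id):
    """Predict the next token id — every step is a lookup or a matmul."""
    h = np.tanh(embed[token_id] @ w_hidden)  # one hidden layer
    logits = h @ w_out                       # project back to the vocabulary
    return int(np.argmax(logits))            # greedy decode

seq = [0]
for _ in range(5):                           # generate five tokens greedily
    seq.append(next_token(seq[-1]))
print(seq)
```

Porting something like this to a 1998 machine comes down to a C compiler, enough RAM for the weights, and a hand-written matrix multiply — which is presumably most of what the "hand-ported inference engine" is.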