Rami's Readings #113 - The Illusion of Thinking
The latest on AI, LLMs, Apple's Paper The Illusion of Thinking, Qwen3 Embedding Model, Japan's Shisa LLM, Tax Code, Chinese Purple, and more.
Welcome to Rami’s Readings #113 - a weekly digest of interesting articles, papers, videos, and X threads from my various sources across the Internet. Expect a list of reads covering AI, technology, business, culture, fashion, travel, and more. Learn about what I do at ramisayar.com/about.
👋🏼 Welcome New Subscribers
Hello! A hearty thank you for subscribing to Rami's Readings! There are quite a few new subscribers this week, thanks to a recommendation from The AI Ethics Brief. I am thrilled to have you on board! In this newsletter, I curate the best papers, tweets, and articles I have read during the week, focusing on LLMs, AI, economics, business, and technology news. You can learn more about me on my website.
📈 Top Recent Editions According to Substack
🤖 AI Reads
The Illusion of Thinking: Understanding the Strengths and Limitations of Reasoning Models via the Lens of Problem Complexity
Notes: Incredible paper from Apple! One of the authors is Samy Bengio (brother of Yoshua Bengio) 🍁 TL;DR: models aren't really developing generalizable reasoning, and training data is contaminated.
Through extensive experimentation across diverse puzzles, we show that frontier LRMs face a complete accuracy collapse beyond certain complexities. Moreover, they exhibit a counter-intuitive scaling limit: their reasoning effort increases with problem complexity up to a point, then declines despite having an adequate token budget.
Qwen3 Embedding: Advancing Text Embedding and Reranking Through Foundation Models
Notes: The smallest embedding model is just 0.6B.
Manus Has Kick-Started an AI Agent Boom in China
Notes: From the MIT Technology Review. The Manus hype has died down on the West Coast, but I am curious to hear from friends abroad.
Shisa V2 405B: Japan’s Highest Performing LLM
Notes: A new competitor to Sakana? Anyone know anything about them?
💼 Business Reads
The Tax Code Change That's Fueling Mass Tech Layoffs
Notes: I heard about this tax code change in 2018 and promptly forgot about it, because I wrongly assumed it wouldn’t persist. Effectively, US technology companies are forced to amortize software research & development costs over five years instead of deducting them in the tax year they are incurred. It makes software R&D (only!) more expensive than any other R&D. This affects all R&D before technological feasibility, at which point the remaining costs are amortized per ASC 985-20.
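To see why spreading the deduction out stings, here is a minimal back-of-the-envelope sketch. It assumes simple straight-line amortization and ignores the mid-year convention that actual Section 174 rules apply (which makes year one even worse); the numbers are purely illustrative.

```python
# Illustrative only: year-by-year deductions for $1M of software R&D,
# comparing immediate expensing with simplified straight-line
# five-year amortization (real rules use a mid-year convention).

def deductions_immediate(cost: float) -> list[float]:
    """Entire cost deducted in the year it is incurred."""
    return [cost, 0.0, 0.0, 0.0, 0.0]

def deductions_amortized(cost: float, years: int = 5) -> list[float]:
    """Cost spread evenly over the amortization period."""
    return [cost / years] * years

cost = 1_000_000.0
immediate = deductions_immediate(cost)
amortized = deductions_amortized(cost)

# In year one, taxable income rises by the deduction the
# company can no longer take up front:
extra_taxable_income_year1 = immediate[0] - amortized[0]
print(extra_taxable_income_year1)  # 800000.0
```

Same total deduction over five years, but a company that just spent $1M in cash on engineers suddenly owes tax as if it had $800K more profit in year one.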
Quant Firm’s $1 Billion Code Is Focus of Rare Criminal Case
Notes: Interesting legal case to keep an eye on if you’re in high-end software development or machine learning.
🔀 Other Reads
Falsehoods Programmers Believe About Aviation
Notes: Anyone want to collaborate on a similar article about machine learning?
TTArtisan On-Camera Flash & Wireless Trigger
Notes: Yes to more vintage and retro camera gear!
The Case for Monogramming Everything
Notes: I used to be completely against monogramming, but now I monogram items I’d never consider reselling—like custom shirts (the only person I know with similar measurements is my brother), my backpack (which lasts a decade and is easily confused with others), and checked luggage (much easier to spot in a sea of bags and helpful for airline recovery if lost).
The Lost Art of Chinese Purple
Notes: Reminds me of the quest to recreate Tyrian purple.
Signing off from Redmond, WA.