<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>AI Safety on 程式猿 AFA 的隨手筆記</title>
    <link>https://appfromape.com/tags/ai%E5%AE%89%E5%85%A8/</link>
    <description>Recent content in AI Safety on 程式猿 AFA 的隨手筆記</description>
    <generator>Hugo -- 0.147.7</generator>
    <language>zh-tw</language>
    <lastBuildDate>Sat, 11 Apr 2026 19:30:00 +0800</lastBuildDate>
    <atom:link href="https://appfromape.com/tags/ai%E5%AE%89%E5%85%A8/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>The AI Even Anthropic Won't Release: Just How Dangerous Is Project Glasswing?</title>
      <link>https://appfromape.com/posts/anthropic_project_glasswing/</link>
      <pubDate>Sat, 11 Apr 2026 19:30:00 +0800</pubDate>
      <guid>https://appfromape.com/posts/anthropic_project_glasswing/</guid>
      <description>As AI capabilities grow, safety evaluation, public transparency, and misuse risk matter more. This post collects perspectives on the discussion around Project Glasswing.</description>
    </item>
    <item>
      <title>The AI Tools You Use Every Day Sit on a Supply Chain No One Is Guarding</title>
      <link>https://appfromape.com/posts/ai_tool_supply_chain_risk/</link>
      <pubDate>Fri, 10 Apr 2026 19:30:22 +0800</pubDate>
      <guid>https://appfromape.com/posts/ai_tool_supply_chain_risk/</guid>
      <description>An AI tool is more than an interface and a model; behind it are packages, APIs, data sources, and deployment environments. This post covers the basic concepts of AI supply-chain risk.</description>
    </item>
  </channel>
</rss>
