<?xml version="1.0" encoding="utf-8" standalone="yes"?>
<rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:content="http://purl.org/rss/1.0/modules/content/">
  <channel>
    <title>RecSys Series on Abdullah Al Mamun</title>
    <link>https://pwaabdullah.github.io/categories/recsys-series/</link>
    <description>Recent content in RecSys Series on Abdullah Al Mamun</description>
    <image>
      <title>Abdullah Al Mamun</title>
      <url>https://pwaabdullah.github.io/og-default.png</url>
      <link>https://pwaabdullah.github.io/og-default.png</link>
    </image>
    <generator>Hugo</generator>
    <language>en</language>
    <copyright>newabdullah 2025</copyright>
    <lastBuildDate>Wed, 12 Mar 2025 00:00:00 +0000</lastBuildDate>
    <atom:link href="https://pwaabdullah.github.io/categories/recsys-series/index.xml" rel="self" type="application/rss+xml" />
    <item>
      <title>The Evolution of RecSys — Part 3: The Deep Learning Era (NCF, Wide &amp; Deep, DeepFM, DIN, DLRM, AdaTT)</title>
      <link>https://pwaabdullah.github.io/posts/the-evaluation-of-recsys-part-3/</link>
      <pubDate>Wed, 12 Mar 2025 00:00:00 +0000</pubDate>
      <guid>https://pwaabdullah.github.io/posts/the-evaluation-of-recsys-part-3/</guid>
      <description>Part 3 of the RecSys series. Traces the deep-learning revolution in RecSys from 2016 to 2023 — Neural Collaborative Filtering, Wide &amp; Deep, DeepFM, Deep Interest Network, DLRM, and AdaTT. Architectures, intuition, and where each one wins.</description>
    </item>
    <item>
      <title>The Evolution of RecSys — Part 2: Factorization Machines and XGBoost</title>
      <link>https://pwaabdullah.github.io/posts/the-evaluation-of-recsys-part-2/</link>
      <pubDate>Tue, 11 Mar 2025 00:00:00 +0000</pubDate>
      <guid>https://pwaabdullah.github.io/posts/the-evaluation-of-recsys-part-2/</guid>
      <description>Part 2 of the RecSys series. Factorization Machines generalize matrix factorization to arbitrary feature spaces, and XGBoost brings non-linear ranking via gradient-boosted trees. We cover the math, the loss functions, each model&#39;s strengths, and the limitations that drove the field toward deep learning.</description>
    </item>
    <item>
      <title>The Evolution of RecSys — Part 1: From Content-Based Filtering to Matrix Factorization</title>
      <link>https://pwaabdullah.github.io/posts/the-evaluation-of-recsys-part-1/</link>
      <pubDate>Sat, 01 Mar 2025 00:00:00 +0000</pubDate>
      <guid>https://pwaabdullah.github.io/posts/the-evaluation-of-recsys-part-1/</guid>
      <description>Part 1 of a deep-dive series on the evolution of recommendation systems. Covers content-based filtering, collaborative filtering (user/item), and matrix factorization — with loss functions, intuition, and where each technique breaks down.</description>
    </item>
  </channel>
</rss>
