<rss version="2.0">
  <channel>
    <title>Meet Gor - Tag: advent-of-sql</title>
    <link>https://www.meetgor.com</link>
    <description>Posts tagged with advent-of-sql</description>
    <language>en-us</language>
    <pubDate>Sat, 18 Apr 2026 05:00:11 UTC</pubDate>
    <item>
      <title>Advent of SQL 2025 Day 15: Confirmation Phrase Dispatches</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-15</link>
      <description>Advent of SQL, Day 15 - Confirmation Phrase Dispatches We are on the final day of Advent of SQL! I can&#39;t believe it, I completed it! (with some help of course,</description>
      <pubDate>Sun, 28 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL, Day 15 - Confirmation Phrase Dispatches&#xA;&#xA;We are on the final day of Advent of SQL!&#xA;&#xA;I can&#39;t believe it, I completed it! (With some help, of course; I ~~don&#39;t~~ didn&#39;t know SQL very well, but these 15 days flipped that around.) I feel fresh and ready to start the year by going deep into SQLite and databases!&#xA;&#xA;Let&#39;s solve the last day and see if it still has something to teach us!&#xA;&#xA;We need to make a few changes.&#xA;&#xA;SQLite (before version 3.38) doesn&#39;t have the `-&gt;&gt;` operator for extracting JSON values from columns, so we use `json_extract` instead.&#xA;&#xA;So, replace the 9th line of the schema like this:&#xA;&#xA;```diff&#xA;-    marker_letter TEXT GENERATED ALWAYS AS (payload -&gt;&gt; &#39;marker&#39;) STORED,&#xA;+    marker_letter TEXT GENERATED ALWAYS AS (json_extract(payload, &#39;$.marker&#39;)) STORED,&#xA;```&#xA;&#xA;Once we have that, we also need to remove the `::jsonb` cast from each insert row; we can do that with `sed` or any other utility you like.&#xA;&#xA;```&#xA;sed &#34;s/&#39;\(.*\)&#39;::jsonb/&#39;\1&#39;/g&#34; day15-inserts.sql &gt; day15-inserts-sqlite.sql&#xA;```&#xA;&#xA;SQLite also has no `SERIAL` type, so the id column becomes `INTEGER PRIMARY KEY` to keep it auto-incrementing.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS incoming_dispatches;&#xA;DROP TABLE IF EXISTS system_dispatches;&#xA;&#xA;CREATE TABLE system_dispatches (&#xA;    id INTEGER PRIMARY KEY,&#xA;    system_id TEXT NOT NULL,&#xA;    dispatched_at TIMESTAMP NOT NULL,&#xA;    payload JSONB NOT NULL,&#xA;    marker_letter TEXT GENERATED ALWAYS AS (json_extract(payload, &#39;$.marker&#39;)) STORED,&#xA;    UNIQUE (system_id, dispatched_at, payload)&#xA;);&#xA;&#xA;CREATE TABLE incoming_dispatches (&#xA;    system_id TEXT,&#xA;    dispatched_at TIMESTAMP,&#xA;    payload JSONB&#xA;);&#xA;&#xA;INSERT INTO system_dispatches (system_id, dispatched_at, payload) VALUES&#xA;(&#39;SYS-0081&#39;, &#39;2025-12-21T06:02:26&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0137&#39;, 
&#39;2025-12-19T06:03:21&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0237&#39;, &#39;2025-12-19T06:23:37&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0006&#39;, &#39;2025-12-24T18:10:16&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0170&#39;, &#39;2025-12-19T06:17:24&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0224&#39;, &#39;2025-12-19T06:23:24&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0007&#39;, &#39;2025-12-24T18:10:06&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0035&#39;, &#39;2025-12-23T15:55:34&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0010&#39;, &#39;2025-12-23T15:55:09&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0037&#39;, &#39;2025-12-23T15:55:36&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0001&#39;, &#39;2025-12-24T08:02:00&#39;, &#39;{&#34;marker&#34;: &#34;X&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0225&#39;, &#39;2025-12-19T06:23:25&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0095&#39;, &#39;2025-12-21T06:02:40&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0021&#39;, &#39;2025-12-23T15:55:20&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0001&#39;, &#39;2025-12-24T08:10:00&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0009&#39;, &#39;2025-12-24T08:02:08&#39;, &#39;{&#34;marker&#34;: 
&#34;U&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0119&#39;, &#39;2025-12-19T06:03:03&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0142&#39;, &#39;2025-12-19T06:03:26&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0048&#39;, &#39;2025-12-23T15:55:30&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0228&#39;, &#39;2025-12-19T06:23:28&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0092&#39;, &#39;2025-12-21T06:02:37&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0008&#39;, &#39;2025-12-24T08:02:07&#39;, &#39;{&#34;marker&#34;: &#34;B&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0111&#39;, &#39;2025-12-19T06:02:55&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0106&#39;, &#39;2025-12-19T06:02:50&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0082&#39;, &#39;2025-12-21T06:02:27&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0180&#39;, &#39;2025-12-19T06:17:31&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0103&#39;, &#39;2025-12-19T06:02:47&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0096&#39;, &#39;2025-12-21T06:02:41&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0003&#39;, &#39;2025-12-23T06:02:02&#39;, &#39;{&#34;marker&#34;: &#34;Z&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0157&#39;, &#39;2025-12-19T06:17:11&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: 
&#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0010&#39;, &#39;2025-12-23T06:02:09&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0102&#39;, &#39;2025-12-19T06:02:46&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0072&#39;, &#39;2025-12-23T06:02:18&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0175&#39;, &#39;2025-12-19T06:17:29&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0214&#39;, &#39;2025-12-19T06:23:14&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0005&#39;, &#39;2025-12-24T08:02:04&#39;, &#39;{&#34;marker&#34;: &#34;P&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0216&#39;, &#39;2025-12-19T06:23:16&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0047&#39;, &#39;2025-12-23T15:55:46&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0019&#39;, &#39;2025-12-23T15:55:18&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0043&#39;, &#39;2025-12-23T15:55:42&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0039&#39;, &#39;2025-12-23T15:55:38&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0212&#39;, &#39;2025-12-19T06:23:12&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0133&#39;, &#39;2025-12-19T06:03:17&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0005&#39;, &#39;2025-12-24T08:10:04&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0011&#39;, 
&#39;2025-12-24T08:02:10&#39;, &#39;{&#34;marker&#34;: &#34;J&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0080&#39;, &#39;2025-12-21T06:02:25&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0230&#39;, &#39;2025-12-19T06:23:30&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0100&#39;, &#39;2025-12-21T06:02:45&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0061&#39;, &#39;2025-12-23T06:02:07&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0022&#39;, &#39;2025-12-23T15:55:21&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0149&#39;, &#39;2025-12-19T06:17:03&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0009&#39;, &#39;2025-12-23T15:55:08&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0002&#39;, &#39;2025-12-24T08:02:01&#39;, &#39;{&#34;marker&#34;: &#34;Y&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0132&#39;, &#39;2025-12-19T06:03:16&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0033&#39;, &#39;2025-12-23T15:55:32&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0193&#39;, &#39;2025-12-19T06:17:44&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0167&#39;, &#39;2025-12-19T06:17:21&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0008&#39;, &#39;2025-12-24T08:10:07&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0186&#39;, &#39;2025-12-19T06:17:37&#39;, &#39;{&#34;marker&#34;: 
&#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0078&#39;, &#39;2025-12-21T06:02:23&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0085&#39;, &#39;2025-12-21T06:02:30&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0046&#39;, &#39;2025-12-23T15:55:45&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0054&#39;, &#39;2025-12-23T15:55:36&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0235&#39;, &#39;2025-12-19T06:23:35&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0089&#39;, &#39;2025-12-21T06:02:34&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0011&#39;, &#39;2025-12-23T06:02:10&#39;, &#39;{&#34;marker&#34;: &#34;J&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0034&#39;, &#39;2025-12-23T15:55:33&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0129&#39;, &#39;2025-12-19T06:03:13&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0211&#39;, &#39;2025-12-19T06:23:11&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0151&#39;, &#39;2025-12-19T06:17:05&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0204&#39;, &#39;2025-12-19T06:23:04&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0213&#39;, &#39;2025-12-19T06:23:13&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0147&#39;, &#39;2025-12-19T06:03:31&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: 
&#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0140&#39;, &#39;2025-12-19T06:03:24&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0123&#39;, &#39;2025-12-19T06:03:07&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0052&#39;, &#39;2025-12-23T15:55:34&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0148&#39;, &#39;2025-12-19T06:17:02&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0059&#39;, &#39;2025-12-23T06:02:05&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0027&#39;, &#39;2025-12-23T15:55:26&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0055&#39;, &#39;2025-12-23T15:55:37&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0155&#39;, &#39;2025-12-19T06:17:09&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0107&#39;, &#39;2025-12-19T06:02:51&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0173&#39;, &#39;2025-12-19T06:17:27&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0007&#39;, &#39;2025-12-23T06:02:06&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0016&#39;, &#39;2025-12-23T15:55:15&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0168&#39;, &#39;2025-12-19T06:17:22&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0196&#39;, &#39;2025-12-19T06:17:47&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0234&#39;, 
&#39;2025-12-19T06:23:34&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0210&#39;, &#39;2025-12-19T06:23:10&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0135&#39;, &#39;2025-12-19T06:03:19&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0150&#39;, &#39;2025-12-19T06:17:04&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0091&#39;, &#39;2025-12-21T06:02:36&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0154&#39;, &#39;2025-12-19T06:17:08&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0223&#39;, &#39;2025-12-19T06:23:23&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0020&#39;, &#39;2025-12-23T15:55:19&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0160&#39;, &#39;2025-12-19T06:17:14&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0128&#39;, &#39;2025-12-19T06:03:12&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0194&#39;, &#39;2025-12-19T06:17:45&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0231&#39;, &#39;2025-12-19T06:23:31&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0045&#39;, &#39;2025-12-23T15:55:44&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0190&#39;, &#39;2025-12-19T06:17:41&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0197&#39;, &#39;2025-12-19T06:17:48&#39;, 
&#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0002&#39;, &#39;2025-12-23T06:02:01&#39;, &#39;{&#34;marker&#34;: &#34;Y&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0008&#39;, &#39;2025-12-24T18:10:07&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0158&#39;, &#39;2025-12-19T06:17:12&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0003&#39;, &#39;2025-12-24T08:10:02&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0171&#39;, &#39;2025-12-19T06:17:25&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0229&#39;, &#39;2025-12-19T06:23:29&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0079&#39;, &#39;2025-12-21T06:02:24&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0215&#39;, &#39;2025-12-19T06:23:15&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0233&#39;, &#39;2025-12-19T06:23:33&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0060&#39;, &#39;2025-12-23T06:02:06&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0026&#39;, &#39;2025-12-23T15:55:25&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0125&#39;, &#39;2025-12-19T06:03:09&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0001&#39;, &#39;2025-12-24T18:10:00&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0136&#39;, &#39;2025-12-19T06:03:20&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: 
&#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0010&#39;, &#39;2025-12-24T08:02:09&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0010&#39;, &#39;2025-12-24T08:10:09&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0044&#39;, &#39;2025-12-23T15:55:43&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0115&#39;, &#39;2025-12-19T06:02:59&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0005&#39;, &#39;2025-12-24T18:10:15&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0094&#39;, &#39;2025-12-21T06:02:39&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0191&#39;, &#39;2025-12-19T06:17:42&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0182&#39;, &#39;2025-12-19T06:17:33&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0195&#39;, &#39;2025-12-19T06:17:46&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0227&#39;, &#39;2025-12-19T06:23:27&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0032&#39;, &#39;2025-12-23T15:55:31&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0177&#39;, &#39;2025-12-19T06:17:31&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0068&#39;, &#39;2025-12-23T06:02:14&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0069&#39;, &#39;2025-12-23T06:02:15&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0009&#39;, 
&#39;2025-12-23T06:02:08&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0176&#39;, &#39;2025-12-19T06:17:30&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0008&#39;, &#39;2025-12-23T06:02:07&#39;, &#39;{&#34;marker&#34;: &#34;B&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0099&#39;, &#39;2025-12-21T06:02:44&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0014&#39;, &#39;2025-12-23T15:55:13&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0042&#39;, &#39;2025-12-23T15:55:41&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0006&#39;, &#39;2025-12-24T08:02:05&#39;, &#39;{&#34;marker&#34;: &#34;R&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0187&#39;, &#39;2025-12-19T06:17:38&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0174&#39;, &#39;2025-12-19T06:17:28&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0056&#39;, &#39;2025-12-23T15:55:55&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0199&#39;, &#39;2025-12-19T06:17:50&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0120&#39;, &#39;2025-12-19T06:03:04&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0003&#39;, &#39;2025-12-24T08:02:02&#39;, &#39;{&#34;marker&#34;: &#34;Z&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0116&#39;, &#39;2025-12-19T06:02:59&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0008&#39;, &#39;2025-12-24T18:10:18&#39;, &#39;{&#34;marker&#34;: 
&#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0205&#39;, &#39;2025-12-19T06:23:05&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0006&#39;, &#39;2025-12-24T08:10:05&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0145&#39;, &#39;2025-12-19T06:03:29&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0004&#39;, &#39;2025-12-24T08:10:03&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0065&#39;, &#39;2025-12-23T06:02:11&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0086&#39;, &#39;2025-12-21T06:02:31&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0188&#39;, &#39;2025-12-19T06:17:39&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0183&#39;, &#39;2025-12-19T06:17:34&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0062&#39;, &#39;2025-12-23T06:02:08&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0058&#39;, &#39;2025-12-23T15:55:57&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0169&#39;, &#39;2025-12-19T06:17:23&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0105&#39;, &#39;2025-12-19T06:02:49&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0001&#39;, &#39;2025-12-24T18:10:11&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0005&#39;, &#39;2025-12-23T06:02:04&#39;, &#39;{&#34;marker&#34;: &#34;P&#34;, &#34;source&#34;: 
&#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0131&#39;, &#39;2025-12-19T06:03:15&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0232&#39;, &#39;2025-12-19T06:23:32&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0028&#39;, &#39;2025-12-23T15:55:27&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0070&#39;, &#39;2025-12-23T06:02:16&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0029&#39;, &#39;2025-12-23T15:55:28&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0127&#39;, &#39;2025-12-19T06:03:11&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0010&#39;, &#39;2025-12-24T18:10:09&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0025&#39;, &#39;2025-12-23T15:55:24&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0164&#39;, &#39;2025-12-19T06:17:18&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0114&#39;, &#39;2025-12-19T06:02:58&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0009&#39;, &#39;2025-12-24T18:10:08&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0184&#39;, &#39;2025-12-19T06:17:35&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0003&#39;, &#39;2025-12-24T18:10:02&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0053&#39;, &#39;2025-12-23T15:55:35&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0117&#39;, 
&#39;2025-12-19T06:03:01&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0074&#39;, &#39;2025-12-23T06:02:20&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0198&#39;, &#39;2025-12-19T06:17:49&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0036&#39;, &#39;2025-12-23T15:55:35&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0112&#39;, &#39;2025-12-19T06:02:56&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0023&#39;, &#39;2025-12-23T15:55:22&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0121&#39;, &#39;2025-12-19T06:03:05&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0181&#39;, &#39;2025-12-19T06:17:32&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0003&#39;, &#39;2025-12-24T18:10:13&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0130&#39;, &#39;2025-12-19T06:03:14&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0066&#39;, &#39;2025-12-23T06:02:12&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0203&#39;, &#39;2025-12-19T06:17:54&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0002&#39;, &#39;2025-12-24T18:10:01&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0104&#39;, &#39;2025-12-19T06:02:48&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0017&#39;, &#39;2025-12-23T15:55:16&#39;, 
&#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0004&#39;, &#39;2025-12-24T18:10:03&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0238&#39;, &#39;2025-12-19T06:23:38&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0110&#39;, &#39;2025-12-19T06:02:54&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0071&#39;, &#39;2025-12-23T06:02:17&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0004&#39;, &#39;2025-12-24T18:10:14&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0031&#39;, &#39;2025-12-23T15:55:30&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0004&#39;, &#39;2025-12-23T06:02:03&#39;, &#39;{&#34;marker&#34;: &#34;M&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0141&#39;, &#39;2025-12-19T06:03:25&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0207&#39;, &#39;2025-12-19T06:23:07&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0122&#39;, &#39;2025-12-19T06:03:06&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0006&#39;, &#39;2025-12-24T18:10:05&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0030&#39;, &#39;2025-12-23T15:55:29&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0192&#39;, &#39;2025-12-19T06:17:43&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0144&#39;, &#39;2025-12-19T06:03:28&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: 
&#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0098&#39;, &#39;2025-12-21T06:02:43&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0200&#39;, &#39;2025-12-19T06:17:51&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0007&#39;, &#39;2025-12-24T08:02:06&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0220&#39;, &#39;2025-12-19T06:23:20&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0063&#39;, &#39;2025-12-23T06:02:09&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0109&#39;, &#39;2025-12-19T06:02:53&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0179&#39;, &#39;2025-12-19T06:17:30&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0163&#39;, &#39;2025-12-19T06:17:17&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0162&#39;, &#39;2025-12-19T06:17:16&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0222&#39;, &#39;2025-12-19T06:23:22&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0134&#39;, &#39;2025-12-19T06:03:18&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0004&#39;, &#39;2025-12-24T08:02:03&#39;, &#39;{&#34;marker&#34;: &#34;M&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0139&#39;, &#39;2025-12-19T06:03:23&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0097&#39;, &#39;2025-12-21T06:02:42&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0146&#39;, 
&#39;2025-12-19T06:03:30&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0101&#39;, &#39;2025-12-19T06:02:45&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0201&#39;, &#39;2025-12-19T06:17:52&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0051&#39;, &#39;2025-12-23T15:55:33&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0172&#39;, &#39;2025-12-19T06:17:26&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0083&#39;, &#39;2025-12-21T06:02:28&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0009&#39;, &#39;2025-12-24T08:10:08&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0124&#39;, &#39;2025-12-19T06:03:08&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0041&#39;, &#39;2025-12-23T15:55:40&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0049&#39;, &#39;2025-12-23T15:55:31&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0202&#39;, &#39;2025-12-19T06:17:53&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0178&#39;, &#39;2025-12-19T06:17:29&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0090&#39;, &#39;2025-12-21T06:02:35&#39;, &#39;{&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0006&#39;, &#39;2025-12-23T06:02:05&#39;, &#39;{&#34;marker&#34;: &#34;R&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0073&#39;, &#39;2025-12-23T06:02:19&#39;, &#39;{&#34;marker&#34;: 
&#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0076&#39;, &#39;2025-12-23T06:02:22&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0217&#39;, &#39;2025-12-19T06:23:17&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0159&#39;, &#39;2025-12-19T06:17:13&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0161&#39;, &#39;2025-12-19T06:17:15&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0221&#39;, &#39;2025-12-19T06:23:21&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0236&#39;, &#39;2025-12-19T06:23:36&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0064&#39;, &#39;2025-12-23T06:02:10&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0126&#39;, &#39;2025-12-19T06:03:10&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0024&#39;, &#39;2025-12-23T15:55:23&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0113&#39;, &#39;2025-12-19T06:02:57&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0038&#39;, &#39;2025-12-23T15:55:37&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0075&#39;, &#39;2025-12-23T06:02:21&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0088&#39;, &#39;2025-12-21T06:02:33&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0007&#39;, &#39;2025-12-24T08:10:06&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: 
&#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0067&#39;, &#39;2025-12-23T06:02:13&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0156&#39;, &#39;2025-12-19T06:17:10&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0040&#39;, &#39;2025-12-23T15:55:39&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0219&#39;, &#39;2025-12-19T06:23:19&#39;, &#39;{&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0166&#39;, &#39;2025-12-19T06:17:20&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0007&#39;, &#39;2025-12-24T18:10:17&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0153&#39;, &#39;2025-12-19T06:17:07&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0077&#39;, &#39;2025-12-23T06:02:23&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0084&#39;, &#39;2025-12-21T06:02:29&#39;, &#39;{&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;);&#xA;&#xA;INSERT INTO incoming_dispatches (system_id, dispatched_at, payload) VALUES&#xA;(&#39;SYS-0013&#39;, &#39;2025-12-23T15:55:12&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0002&#39;, &#39;2025-12-24T08:10:01&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0012&#39;, &#39;2025-12-23T15:55:11&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0108&#39;, &#39;2025-12-19T06:02:52&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0165&#39;, &#39;2025-12-19T06:17:19&#39;, &#39;{&#34;marker&#34;: 
&#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0189&#39;, &#39;2025-12-19T06:17:40&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0050&#39;, &#39;2025-12-23T15:55:32&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0138&#39;, &#39;2025-12-19T06:03:22&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0087&#39;, &#39;2025-12-21T06:02:32&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0208&#39;, &#39;2025-12-19T06:23:08&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0057&#39;, &#39;2025-12-23T15:55:56&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0002&#39;, &#39;2025-12-24T18:10:12&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0011&#39;, &#39;2025-12-24T08:10:10&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0226&#39;, &#39;2025-12-19T06:23:26&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0206&#39;, &#39;2025-12-19T06:23:06&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0118&#39;, &#39;2025-12-19T06:03:02&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0185&#39;, &#39;2025-12-19T06:17:36&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0218&#39;, &#39;2025-12-19T06:23:18&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0018&#39;, &#39;2025-12-23T15:55:17&#39;, &#39;{&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: 
&#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0005&#39;, &#39;2025-12-24T18:10:04&#39;, &#39;{&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0093&#39;, &#39;2025-12-21T06:02:38&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0011&#39;, &#39;2025-12-24T18:10:10&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0011&#39;, &#39;2025-12-23T15:55:10&#39;, &#39;{&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0143&#39;, &#39;2025-12-19T06:03:27&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0152&#39;, &#39;2025-12-19T06:17:06&#39;, &#39;{&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0015&#39;, &#39;2025-12-23T15:55:14&#39;, &#39;{&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0001&#39;, &#39;2025-12-23T06:02:00&#39;, &#39;{&#34;marker&#34;: &#34;X&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0209&#39;, &#39;2025-12-19T06:23:09&#39;, &#39;{&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0133&#39;, &#39;2025-12-19T06:03:17&#39;, &#39;{&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0158&#39;, &#39;2025-12-19T06:17:12&#39;, &#39;{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0195&#39;, &#39;2025-12-19T06:17:46&#39;, &#39;{&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0182&#39;, &#39;2025-12-19T06:17:33&#39;, &#39;{&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0033&#39;, &#39;2025-12-23T15:55:32&#39;, &#39;{&#34;marker&#34;: &#34;W&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0083&#39;, 
&#39;2025-12-21T06:02:28&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0002&#39;, &#39;2025-12-23T06:02:01&#39;, &#39;{&#34;marker&#34;: &#34;Y&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0076&#39;, &#39;2025-12-23T06:02:22&#39;, &#39;{&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;),&#xA;(&#39;SYS-0009&#39;, &#39;2025-12-24T08:02:08&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;primary&#34;}&#39;),&#xA;(&#39;SYS-0194&#39;, &#39;2025-12-19T06:17:45&#39;, &#39;{&#34;marker&#34;: &#34;U&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;);&#xA;&#xA;```&#xA;&#xA;&#xA;```&#xA;$ sqlite3&#xA;SQLite version 3.50.4 2025-07-30 19:33:53&#xA;Enter &#34;.help&#34; for usage hints.&#xA;Connected to a transient in-memory database.&#xA;Use &#34;.open FILENAME&#34; to reopen on a persistent database.&#xA;sqlite&gt; .read day15-inserts.sql&#xA;Parse error near line 19: unrecognized token: &#34;:&#34;&#xA;  6:02:26&#39;, &#39;{&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;::jsonb), (&#39;SYS-0137&#39;, &#39;2025&#xA;                                      error here ---^&#xA;Parse error near line 275: unrecognized token: &#34;:&#34;&#xA;  5:55:12&#39;, &#39;{&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;}&#39;::jsonb), (&#39;SYS-0002&#39;, &#39;2025&#xA;                                      error here ---^&#xA;sqlite&gt; .read day15-inserts-sqlite.sql&#xA;sqlite&gt; .schema&#xA;CREATE TABLE system_dispatches (&#xA;    id SERIAL PRIMARY KEY,&#xA;    system_id TEXT NOT NULL,&#xA;    dispatched_at TIMESTAMP NOT NULL,&#xA;    payload JSONB NOT NULL,&#xA;    marker_letter TEXT GENERATED ALWAYS AS (json_extract(payload, &#39;$.marker&#39;)) STORED,&#xA;    UNIQUE (system_id, dispatched_at, payload)&#xA;);&#xA;CREATE TABLE incoming_dispatches (&#xA;    system_id TEXT,&#xA;    dispatched_at TIMESTAMP,&#xA;    payload 
JSONB&#xA;);&#xA;sqlite&gt; .mode table&#xA;sqlite&gt; SELECT * FROM system_dispatches LIMIT 10;&#xA;+----+-----------+---------------------+----------------------------------------+---------------+&#xA;| id | system_id |    dispatched_at    |                payload                 | marker_letter |&#xA;+----+-----------+---------------------+----------------------------------------+---------------+&#xA;|    | SYS-0081  | 2025-12-21T06:02:26 | {&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;} | T             |&#xA;|    | SYS-0137  | 2025-12-19T06:03:21 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;} | D             |&#xA;|    | SYS-0237  | 2025-12-19T06:23:37 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;} | D             |&#xA;|    | SYS-0006  | 2025-12-24T18:10:16 | {&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;} | K             |&#xA;|    | SYS-0170  | 2025-12-19T06:17:24 | {&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;} | I             |&#xA;|    | SYS-0224  | 2025-12-19T06:23:24 | {&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;} | S             |&#xA;|    | SYS-0007  | 2025-12-24T18:10:06 | {&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;} | O             |&#xA;|    | SYS-0035  | 2025-12-23T15:55:34 | {&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;} | C             |&#xA;|    | SYS-0010  | 2025-12-23T15:55:09 | {&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;} | L             |&#xA;|    | SYS-0037  | 2025-12-23T15:55:36 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;} | D             |&#xA;+----+-----------+---------------------+----------------------------------------+---------------+&#xA;sqlite&gt; SELECT * FROM incoming_dispatches LIMIT 10;&#xA;+-----------+---------------------+----------------------------------------+&#xA;| 
system_id |    dispatched_at    |                payload                 |&#xA;+-----------+---------------------+----------------------------------------+&#xA;| SYS-0013  | 2025-12-23T15:55:12 | {&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0002  | 2025-12-24T08:10:01 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;primary&#34;}   |&#xA;| SYS-0012  | 2025-12-23T15:55:11 | {&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0108  | 2025-12-19T06:02:52 | {&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0165  | 2025-12-19T06:17:19 | {&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0189  | 2025-12-19T06:17:40 | {&#34;marker&#34;: &#34;H&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0050  | 2025-12-23T15:55:32 | {&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0138  | 2025-12-19T06:03:22 | {&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0087  | 2025-12-21T06:02:32 | {&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;| SYS-0208  | 2025-12-19T06:23:08 | {&#34;marker&#34;: &#34;G&#34;, &#34;source&#34;: &#34;secondary&#34;} |&#xA;+-----------+---------------------+----------------------------------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Good to go!&#xA;&#xA;## Problem&#xA;&#xA;&gt; Reconstruct the final confirmation phrase to text Santa based on the elves’ hazy recollection of how they solved this problem before.&#xA;&gt; &#xA;&gt; Your final result should include the marker_letter for each system, using only the most recent dispatch from a primary source. 
Once the correct dispatch has been identified for every system, combine the results and order them by dispatched_at in ascending order to reveal the confirmation phrase.&#xA;&gt; &#xA;&gt; The sleigh won’t launch without it.&#xA;&#xA;So, we first need to make sure we have the right `dispatches`.&#xA;&#xA;We have two tables.&#xA;&#xA;1. `system_dispatches`&#xA;2. `incoming_dispatches`&#xA;&#xA;The problem statement also states:&#xA;&#xA;&gt; The system kept throwing errors until we figured out how to handle duplicates. Whatever you do, the records already in system_dispatches must take precedence.&#xA;&#xA;So, we need to take `system_dispatches` as the ground truth. However, we also need to bring in the new entries from `incoming_dispatches`.&#xA;&#xA;So, we will first try to insert everything from `incoming_dispatches` into `system_dispatches`:&#xA;&#xA;```sql&#xA;INSERT INTO system_dispatches (system_id, dispatched_at, payload)&#xA;SELECT system_id, dispatched_at, payload&#xA;FROM incoming_dispatches;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; INSERT INTO system_dispatches (system_id, dispatched_at, payload)&#xA;SELECT system_id, dispatched_at, payload&#xA;FROM incoming_dispatches;&#xA;Runtime error: UNIQUE constraint failed: system_dispatches.system_id, system_dispatches.dispatched_at, system_dispatches.payload (19)&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Oops! Some rows must already exist in `system_dispatches`. Let&#39;s measure the overlap:&#xA;&#xA;```sql&#xA;SELECT COUNT(*) as system_dispatches_count FROM system_dispatches;&#xA;SELECT COUNT(*) as incoming_dispatches_count FROM incoming_dispatches;&#xA;SELECT COUNT(*) as common_dispatches_count FROM (SELECT system_id, dispatched_at, payload FROM system_dispatches INTERSECT SELECT * FROM incoming_dispatches);&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT COUNT(*) FROM system_dispatches ;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 254      |&#xA;+----------+&#xA;sqlite&gt; SELECT COUNT(*) FROM 
incoming_dispatches;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 38       |&#xA;+----------+&#xA;&#xA;sqlite&gt; SELECT COUNT(*) FROM (SELECT system_id, dispatched_at, payload FROM system_dispatches UNION SELECT * FROM incoming_dispatches);&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 282      |&#xA;+----------+&#xA;&#xA;sqlite&gt; SELECT DISTINCT COUNT(*) FROM (SELECT system_id, dispatched_at, payload FROM system_dispatches UNION SELECT * FROM incoming_dispatches);&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 282      |&#xA;+----------+&#xA;&#xA;sqlite&gt; SELECT COUNT(*) FROM (SELECT system_id, dispatched_at, payload FROM system_dispatches INTERSECT SELECT * FROM incoming_dispatches);&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 10       |&#xA;+----------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;We need to be careful about what we insert into `system_dispatches`.&#xA;&#xA;The `10` duplicate rows are what trip the `UNIQUE` constraint. With `INSERT OR IGNORE`, SQLite skips any row that would violate the constraint instead of aborting the whole insert.&#xA;&#xA;```sql&#xA;INSERT OR IGNORE INTO system_dispatches (system_id, dispatched_at, payload)&#xA;SELECT system_id, dispatched_at, payload&#xA;FROM incoming_dispatches;&#xA;```&#xA;&#xA;OK, now if we check the count, we get `282`, which makes sense:&#xA;&#xA;- The original `system_dispatches` had a count of `254` before the merge.&#xA;- The original `incoming_dispatches` had a count of `38`.&#xA;- After merging `incoming_dispatches` into `system_dispatches`, we have `282` records instead of `254+38=292`. What about the missing `10`? Those were already there in `system_dispatches`. 
As we saw in the earlier query, there were `10` rows in common between the two tables before the merge.&#xA;&#xA;```&#xA;sqlite&gt; SELECT COUNT(*) FROM system_dispatches ;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 254      |&#xA;+----------+&#xA;sqlite&gt; SELECT COUNT(*) FROM incoming_dispatches;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 38       |&#xA;+----------+&#xA;sqlite&gt; INSERT INTO system_dispatches (system_id, dispatched_at, payload)&#xA;SELECT system_id, dispatched_at, payload&#xA;FROM incoming_dispatches;&#xA;Runtime error: UNIQUE constraint failed: system_dispatches.system_id, system_dispatches.dispatched_at, system_dispatches.payload (19)&#xA;sqlite&gt; INSERT OR IGNORE INTO system_dispatches (system_id, dispatched_at, payload)&#xA;SELECT system_id, dispatched_at, payload&#xA;FROM incoming_dispatches;&#xA;sqlite&gt; SELECT COUNT(*) FROM system_dispatches ;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 282      |&#xA;+----------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Now, to the actual problem.&#xA;&#xA;What do we need to do now?&#xA;&#xA;- Use only the most recent dispatch from a primary source. 
&#xA;- Once the correct dispatch has been identified for every system, combine the results and order them by dispatched_at in ascending order &#xA;- Reveal the confirmation phrase&#xA;&#xA;We have the single source now!&#xA;&#xA;```sql&#xA;SELECT * FROM system_dispatches LIMIT 10;&#xA;```&#xA;&#xA;```&#xA;SELECT * FROM system_dispatches LIMIT 10;&#xA;+----+-----------+---------------------+----------------------------------------+---------------+&#xA;| id | system_id |    dispatched_at    |                payload                 | marker_letter |&#xA;+----+-----------+---------------------+----------------------------------------+---------------+&#xA;|    | SYS-0081  | 2025-12-21T06:02:26 | {&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;secondary&#34;} | T             |&#xA;|    | SYS-0137  | 2025-12-19T06:03:21 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;} | D             |&#xA;|    | SYS-0237  | 2025-12-19T06:23:37 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;} | D             |&#xA;|    | SYS-0006  | 2025-12-24T18:10:16 | {&#34;marker&#34;: &#34;K&#34;, &#34;source&#34;: &#34;secondary&#34;} | K             |&#xA;|    | SYS-0170  | 2025-12-19T06:17:24 | {&#34;marker&#34;: &#34;I&#34;, &#34;source&#34;: &#34;secondary&#34;} | I             |&#xA;|    | SYS-0224  | 2025-12-19T06:23:24 | {&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;secondary&#34;} | S             |&#xA;|    | SYS-0007  | 2025-12-24T18:10:06 | {&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;secondary&#34;} | O             |&#xA;|    | SYS-0035  | 2025-12-23T15:55:34 | {&#34;marker&#34;: &#34;C&#34;, &#34;source&#34;: &#34;secondary&#34;} | C             |&#xA;|    | SYS-0010  | 2025-12-23T15:55:09 | {&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;secondary&#34;} | L             |&#xA;|    | SYS-0037  | 2025-12-23T15:55:36 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;secondary&#34;} | D             
|&#xA;+----+-----------+---------------------+----------------------------------------+---------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;We have 5 columns.&#xA;&#xA;- id (blank here: `SERIAL` isn&#39;t a real SQLite type, so the ids were inserted as NULL)&#xA;- system_id&#xA;- dispatched_at&#xA;- payload (a JSON string)&#xA;- marker_letter (a generated column extracted from the payload&#39;s `marker` key)&#xA;&#xA;So, we need to group by system; that&#39;s what the `system_id` is for.&#xA;&#xA;Then we need to pick the latest `dispatched_at` for each system.&#xA;&#xA;Finally, we need to filter for records whose `payload` has `source` set to `primary`.&#xA;&#xA;But wait, if we group by `system_id`, how can we get the latest dispatch per group? We can&#39;t rely on `ORDER BY` alone, as the groups would already have been formed, right?&#xA;&#xA;Oh! We might need window functions, but just a moment, can we do it without them?&#xA;&#xA;Let&#39;s try.&#xA;&#xA;```sql&#xA;SELECT &#xA;    system_id,&#xA;    MAX(dispatched_at),&#xA;    payload, &#xA;    marker_letter&#xA;FROM system_dispatches&#xA;WHERE json_extract(payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;GROUP BY system_id&#xA;ORDER BY dispatched_at ASC;&#xA;```&#xA;&#xA;Oh boy, that was simple!&#xA;&#xA;We just grouped by `system_id`, filtered the source to `primary`, and selected the row with `MAX(dispatched_at)`, which gives us the latest dispatch record for each system. 
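
A side note on why `MAX(dispatched_at)` is safe on a TEXT column at all: the timestamps are ISO-8601 strings, which sort lexicographically in the same order as chronologically. A minimal Python sketch (the sample values are made up for illustration):

```python
# ISO-8601 timestamps compare correctly as plain strings,
# which is why MAX() over the TEXT dispatched_at column works.
stamps = [
    "2025-12-24T08:10:01",
    "2025-12-19T06:02:26",
    "2025-12-23T15:55:12",
]

# The lexicographic maximum is also the chronologically latest.
latest = max(stamps)
print(latest)  # 2025-12-24T08:10:01
```

This only holds because every timestamp uses the same fixed-width format; mixed formats or varying offsets would break the property.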
Boom!&#xA;&#xA;The confirmation phrase is `ADVENTOFSQL`.&#xA;&#xA;We can just select the `marker_letter`:&#xA;&#xA;```sql&#xA;SELECT&#xA;    marker_letter&#xA;FROM system_dispatches&#xA;WHERE json_extract(payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;GROUP BY system_id&#xA;ORDER BY dispatched_at ASC;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT&#xA;    marker_letter&#xA;FROM system_dispatches&#xA;WHERE json_extract(payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;GROUP BY system_id&#xA;ORDER BY dispatched_at ASC;&#xA;+---------------+&#xA;| marker_letter |&#xA;+---------------+&#xA;| X             |&#xA;| Y             |&#xA;| Z             |&#xA;| M             |&#xA;| P             |&#xA;| R             |&#xA;| K             |&#xA;| B             |&#xA;| U             |&#xA;| C             |&#xA;| J             |&#xA;+---------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Whoops! What happened? The marker letters changed?&#xA;&#xA;Yep: we never told SQLite which row of each group to return. With `MAX(dispatched_at)` in the select list, the bare columns come from the row with the latest date. 
Without it, SQLite picks an arbitrary record from each group, which could carry any `dispatched_at`, not necessarily the latest.&#xA;&#xA;So, we need `MAX(dispatched_at)` included in the select list.&#xA;&#xA;```sql&#xA;SELECT&#xA;    marker_letter, MAX(dispatched_at)&#xA;FROM system_dispatches&#xA;WHERE json_extract(payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;GROUP BY system_id&#xA;ORDER BY dispatched_at ASC;&#xA;```&#xA;&#xA;```&#xA; SELECT&#xA;    marker_letter, MAX(dispatched_at)&#xA;FROM system_dispatches&#xA;WHERE json_extract(payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;GROUP BY system_id&#xA;ORDER BY dispatched_at ASC;&#xA;+---------------+---------------------+&#xA;| marker_letter | MAX(dispatched_at)  |&#xA;+---------------+---------------------+&#xA;| A             | 2025-12-24T08:10:00 |&#xA;| D             | 2025-12-24T08:10:01 |&#xA;| V             | 2025-12-24T08:10:02 |&#xA;| E             | 2025-12-24T08:10:03 |&#xA;| N             | 2025-12-24T08:10:04 |&#xA;| T             | 2025-12-24T08:10:05 |&#xA;| O             | 2025-12-24T08:10:06 |&#xA;| F             | 2025-12-24T08:10:07 |&#xA;| S             | 2025-12-24T08:10:08 |&#xA;| Q             | 2025-12-24T08:10:09 |&#xA;| L             | 2025-12-24T08:10:10 |&#xA;+---------------+---------------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;There we go! It scared me for a moment.&#xA;&#xA;We keep only the rows whose `source` is `primary`, take the latest `dispatched_at` for each `system_id`, and order the result ascending.&#xA;&#xA;If you don&#39;t want to do it this way, 
I found a few more hacks.&#xA;&#xA;### With Subquery&#xA;&#xA;We can select the `MAX(dispatched_at)` for each `system_id` in a correlated subquery and filter on the `primary` source as usual.&#xA;&#xA;```sql&#xA;SELECT &#xA;    *&#xA;FROM system_dispatches&#xA;WHERE &#xA;    json_extract(system_dispatches.payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;    AND system_dispatches.dispatched_at = (&#xA;        SELECT &#xA;            MAX(latest_dispatches.dispatched_at)&#xA;        FROM system_dispatches latest_dispatches&#xA;        WHERE &#xA;            latest_dispatches.system_id = system_dispatches.system_id&#xA;            AND json_extract(latest_dispatches.payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;   )&#xA;ORDER BY system_dispatches.dispatched_at ASC;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT &#xA;    *&#xA;FROM system_dispatches&#xA;WHERE &#xA;    json_extract(system_dispatches.payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;    AND system_dispatches.dispatched_at = (&#xA;        SELECT &#xA;            MAX(latest_dispatches.dispatched_at)&#xA;        FROM system_dispatches latest_dispatches&#xA;        WHERE &#xA;            latest_dispatches.system_id = system_dispatches.system_id&#xA;            AND json_extract(latest_dispatches.payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;   )&#xA;ORDER BY system_dispatches.dispatched_at ASC;&#xA;&#xA;+----+-----------+---------------------+--------------------------------------+---------------+&#xA;| id | system_id |    dispatched_at    |               payload                | marker_letter |&#xA;+----+-----------+---------------------+--------------------------------------+---------------+&#xA;|    | SYS-0001  | 2025-12-24T08:10:00 | {&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;primary&#34;} | A             |&#xA;|    | SYS-0002  | 2025-12-24T08:10:01 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;primary&#34;} | D             |&#xA;|    | SYS-0003  | 2025-12-24T08:10:02 | 
{&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;primary&#34;} | V             |&#xA;|    | SYS-0004  | 2025-12-24T08:10:03 | {&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;primary&#34;} | E             |&#xA;|    | SYS-0005  | 2025-12-24T08:10:04 | {&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;primary&#34;} | N             |&#xA;|    | SYS-0006  | 2025-12-24T08:10:05 | {&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;primary&#34;} | T             |&#xA;|    | SYS-0007  | 2025-12-24T08:10:06 | {&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;primary&#34;} | O             |&#xA;|    | SYS-0008  | 2025-12-24T08:10:07 | {&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;primary&#34;} | F             |&#xA;|    | SYS-0009  | 2025-12-24T08:10:08 | {&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;primary&#34;} | S             |&#xA;|    | SYS-0010  | 2025-12-24T08:10:09 | {&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;primary&#34;} | Q             |&#xA;|    | SYS-0011  | 2025-12-24T08:10:10 | {&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;primary&#34;} | L             |&#xA;+----+-----------+---------------------+--------------------------------------+---------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;### Window Function&#xA;&#xA;We can even use a window function to partition by `system_id` and, within each partition, order by `dispatched_at` descending (latest first). 
Then we can select from it as a CTE.&#xA;&#xA;We can use the `ROW_NUMBER` function to assign a rank to each row per `system_id`, ordered by `dispatched_at` descending:&#xA;&#xA;```sql&#xA;WITH latest_dispatches AS (&#xA;    SELECT&#xA;        *,&#xA;        ROW_NUMBER() OVER (&#xA;            PARTITION BY system_id&#xA;            ORDER BY dispatched_at DESC&#xA;        ) as rank&#xA;    FROM system_dispatches&#xA;    WHERE json_extract(payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;) SELECT * FROM latest_dispatches WHERE rank = 1 ORDER BY dispatched_at ASC;&#xA;```&#xA;&#xA;The result is the same as the queries above: rows are ranked latest-first within each system, we keep rank `1`, and the final result set is ordered ascending (oldest first).&#xA;&#xA;```&#xA;sqlite&gt; WITH latest_dispatches AS (&#xA;    SELECT&#xA;        *,&#xA;        ROW_NUMBER() OVER (&#xA;            PARTITION BY system_id&#xA;            ORDER BY dispatched_at DESC&#xA;        ) as rank&#xA;    FROM system_dispatches&#xA;    WHERE json_extract(payload, &#39;$.source&#39;) = &#39;primary&#39;&#xA;) SELECT * FROM latest_dispatches WHERE rank = 1 ORDER BY dispatched_at ASC;&#xA;+----+-----------+---------------------+--------------------------------------+---------------+------+&#xA;| id | system_id |    dispatched_at    |               payload                | marker_letter | rank |&#xA;+----+-----------+---------------------+--------------------------------------+---------------+------+&#xA;|    | SYS-0001  | 2025-12-24T08:10:00 | {&#34;marker&#34;: &#34;A&#34;, &#34;source&#34;: &#34;primary&#34;} | A             | 1    |&#xA;|    | SYS-0002  | 2025-12-24T08:10:01 | {&#34;marker&#34;: &#34;D&#34;, &#34;source&#34;: &#34;primary&#34;} | D             | 1    |&#xA;|    | SYS-0003  | 2025-12-24T08:10:02 | {&#34;marker&#34;: &#34;V&#34;, &#34;source&#34;: &#34;primary&#34;} | V             | 1    |&#xA;|    | SYS-0004  | 2025-12-24T08:10:03 | {&#34;marker&#34;: &#34;E&#34;, &#34;source&#34;: &#34;primary&#34;} | E           
  | 1    |&#xA;|    | SYS-0005  | 2025-12-24T08:10:04 | {&#34;marker&#34;: &#34;N&#34;, &#34;source&#34;: &#34;primary&#34;} | N             | 1    |&#xA;|    | SYS-0006  | 2025-12-24T08:10:05 | {&#34;marker&#34;: &#34;T&#34;, &#34;source&#34;: &#34;primary&#34;} | T             | 1    |&#xA;|    | SYS-0007  | 2025-12-24T08:10:06 | {&#34;marker&#34;: &#34;O&#34;, &#34;source&#34;: &#34;primary&#34;} | O             | 1    |&#xA;|    | SYS-0008  | 2025-12-24T08:10:07 | {&#34;marker&#34;: &#34;F&#34;, &#34;source&#34;: &#34;primary&#34;} | F             | 1    |&#xA;|    | SYS-0009  | 2025-12-24T08:10:08 | {&#34;marker&#34;: &#34;S&#34;, &#34;source&#34;: &#34;primary&#34;} | S             | 1    |&#xA;|    | SYS-0010  | 2025-12-24T08:10:09 | {&#34;marker&#34;: &#34;Q&#34;, &#34;source&#34;: &#34;primary&#34;} | Q             | 1    |&#xA;|    | SYS-0011  | 2025-12-24T08:10:10 | {&#34;marker&#34;: &#34;L&#34;, &#34;source&#34;: &#34;primary&#34;} | L             | 1    |&#xA;+----+-----------+---------------------+--------------------------------------+---------------+------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Phew!&#xA;&#xA;That was a good one to end the Advent of SQL!&#xA;&#xA;I enjoyed it!&#xA;&#xA;I learnt a ton:&#xA;&#xA;- CTEs&#xA;- JOINs (some weird stuff can be done)&#xA;- Window Functions (LAG, LEAD, ROW_NUMBER)&#xA;- FTS (in SQLite)&#xA;- JSON parsing&#xA;- Date shenanigans&#xA;- CTEs don&#39;t support INSERT and DELETE in SQLite (it ruined my day 10 solution)&#xA;- Recursive CTEs&#xA;- String manipulation (thanks to XML parsing)&#xA;&#xA;I need to write a post explaining what I learned.&#xA;&#xA;Thanks to [Aaron Francis](https://aaronfrancis.com/) from [databaseschool](https://databaseschool.com/) for this challenge and for explaining each day in depth (I didn&#39;t watch them all), and to [Kelsey Petrich](https://x.com/krpetrich) for the lore of each problem; those were really lovely to read!&#xA;&#xA;Happy Coding :)&#xA;Merry Christmas&#xA;Happy New Year&#xA;Whatever you 
celebrate!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 14: Ski Resort Paths</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-14</link>
      <description>Advent of SQL, Day 14 - Ski Resort Paths Ok, we have reached the penultimate day of the series. It is day 14 of Advent of SQL. Let&#39;s grab the SQL for the day. That&#39;s</description>
      <pubDate>Sun, 28 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL, Day 14 - Ski Resort Paths&#xA;&#xA;Ok, we have reached the penultimate day of the series. It is day 14 of Advent of SQL.&#xA;&#xA;Let&#39;s grab the SQL for the day.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS mountain_network;&#xA;&#xA;&#xA;CREATE TABLE mountain_network (&#xA;    id INTEGER PRIMARY KEY,&#xA;    from_node TEXT,&#xA;    to_node TEXT,&#xA;    node_type TEXT,    -- &#39;Lift&#39; or &#39;Trail&#39;&#xA;    difficulty TEXT    -- Only applicable for trails: &#39;green&#39;, &#39;blue&#39;, &#39;black&#39;, &#39;double_black&#39;&#xA;);&#xA;&#xA;INSERT INTO mountain_network (id, from_node, to_node, node_type, difficulty) VALUES&#xA;(1, &#39;Outlaw Express&#39;, &#39;Stairway Lift&#39;, &#39;Lift&#39;, NULL),&#xA;(2, &#39;Outlaw Express&#39;, &#39;Top Gun Bowl&#39;, &#39;Trail&#39;, &#39;black&#39;),&#xA;(3, &#39;Top Gun Bowl&#39;, &#39;Top Gun&#39;, &#39;Trail&#39;, &#39;black&#39;),&#xA;(4, &#39;Top Gun&#39;, &#39;Montoya&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(5, &#39;Montoya&#39;, &#39;Center Aisle&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(6, &#39;Center Aisle&#39;, &#39;Lower Stampede&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(7, &#39;Stairway Lift&#39;, &#39;Red&#39;&#39;s Lift&#39;, &#39;Lift&#39;, NULL),&#xA;(8, &#39;Stairway Lift&#39;, &#39;Broadway&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(9, &#39;Red&#39;&#39;s Lift&#39;, &#39;Bearclaw&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(10, &#39;Bearclaw&#39;, &#39;Last Chance&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(11, &#39;Last Chance&#39;, &#39;Diamondback&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(12, &#39;Diamondback&#39;, &#39;Broadway&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(13, &#39;Red&#39;&#39;s Lift&#39;, &#39;Bishop&#39;&#39;s Bowl&#39;, &#39;Trail&#39;, &#39;black&#39;),&#xA;(14, &#39;Red&#39;&#39;s Lift&#39;, &#39;Amy&#39;&#39;s Ridge&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(15, &#39;Amy&#39;&#39;s Ridge&#39;, &#39;Grizzly 
Bowl&#39;, &#39;Trail&#39;, &#39;black&#39;),&#xA;(16, &#39;Flathead Lift&#39;, &#39;Amy&#39;&#39;s Ridge&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(17, &#39;Jake&#39;&#39;s Lift&#39;, &#39;Wildwood Lift&#39;, &#39;Lift&#39;, NULL),&#xA;(18, &#39;Wildwood Lift&#39;, &#39;Sidewinder&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(19, &#39;Wildwood Lift&#39;, &#39;Brightside&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(20, &#39;Brightside&#39;, &#39;Moonrise&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(21, &#39;Moonrise&#39;, &#39;Draw&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(22, &#39;Moonrise&#39;, &#39;Lone Pine&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(23, &#39;Draw&#39;, &#39;Maverick&#39;, &#39;Trail&#39;, &#39;blue&#39;),&#xA;(24, &#39;Draw&#39;, &#39;Broadway&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(25, &#39;Broadway&#39;, &#39;Outlaw Trail&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(26, &#39;Outlaw Trail&#39;, &#39;Center Aisle&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(27, &#39;Center Aisle&#39;, &#39;Bandit&#39;, &#39;Trail&#39;, &#39;green&#39;),&#xA;(28, &#39;Jake&#39;&#39;s Lift&#39;, &#39;Maverick&#39;, &#39;Trail&#39;, &#39;blue&#39;);&#xA;&#xA;```&#xA;&#xA;That&#39;s it: we have just one table called `mountain_network`, and it has `28` records.&#xA;&#xA;Let&#39;s look at the problem statement to see what we need to do to make sense of those 28 rows.&#xA;&#xA;&#xA;## Problem&#xA;&#xA;&gt; Find all the possible routes from Jake&#39;s Lift to Maverick. None of the possible routes will take more than 12 connections.&#xA;&#xA;&#xA;Oh! This is a graph-like or network-like problem.&#xA;&#xA;Ouch!&#xA;&#xA;Relational databases look like they would be a good fit for this kind of thing, but when the data sits in a single table, they don&#39;t give us much help out of the box.&#xA;&#xA;Let&#39;s see how we can think about it.&#xA;&#xA;### JOINs and UNIONs&#xA;&#xA;We need to find all the routes from a start node (record), i.e. 
`Jake&#39;s Lift`, and find all the ways that lead to the end node (record), i.e. `Maverick`. The table gives a list of edges, i.e. from which node to which node there is a lift or a trail.&#xA;&#xA;So, we can do a simple select to check if the `from_node` is `Jake&#39;s Lift` and the `to_node` is `Maverick`, right?&#xA;&#xA;But doing that by hand for every route would get far too long.&#xA;&#xA;We would have to look for up to 12 consecutive edges and branch off in each direction.&#xA;&#xA;For the first level it looks simple, like this:&#xA;&#xA;```sql&#xA;SELECT&#xA;    mountain_network.from_node || &#39; -&gt; &#39; || mountain_network.to_node AS path,&#xA;    1 AS connections&#xA;FROM mountain_network&#xA;WHERE&#xA;    mountain_network.from_node = &#39;Jake&#39;&#39;s Lift&#39;&#xA;    AND mountain_network.to_node = &#39;Maverick&#39;;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT mountain_network.from_node || &#39; -&gt; &#39; || mountain_network.to_node AS path, 1 AS connections&#xA;FROM mountain_network&#xA;WHERE mountain_network.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND mountain_network.to_node = &#39;Maverick&#39;&#xA;   ...&gt; ;&#xA;+-------------------------+-------------+&#xA;|          path           | connections |&#xA;+-------------------------+-------------+&#xA;| Jake&#39;s Lift -&gt; Maverick | 1           |&#xA;+-------------------------+-------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Luckily, we have one route directly from `Jake&#39;s Lift` to `Maverick`. But that won&#39;t be the case at every level. 
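
As a quick cross-check on what we should eventually get out of the SQL, here is a small Python sketch (my own addition, not part of the original solution) that walks the same 28 edges with a depth-first search, capped at 12 connections:

```python
# All 28 (from_node, to_node) pairs from the INSERT statements above.
EDGES = [
    ("Outlaw Express", "Stairway Lift"), ("Outlaw Express", "Top Gun Bowl"),
    ("Top Gun Bowl", "Top Gun"), ("Top Gun", "Montoya"),
    ("Montoya", "Center Aisle"), ("Center Aisle", "Lower Stampede"),
    ("Stairway Lift", "Red's Lift"), ("Stairway Lift", "Broadway"),
    ("Red's Lift", "Bearclaw"), ("Bearclaw", "Last Chance"),
    ("Last Chance", "Diamondback"), ("Diamondback", "Broadway"),
    ("Red's Lift", "Bishop's Bowl"), ("Red's Lift", "Amy's Ridge"),
    ("Amy's Ridge", "Grizzly Bowl"), ("Flathead Lift", "Amy's Ridge"),
    ("Jake's Lift", "Wildwood Lift"), ("Wildwood Lift", "Sidewinder"),
    ("Wildwood Lift", "Brightside"), ("Brightside", "Moonrise"),
    ("Moonrise", "Draw"), ("Moonrise", "Lone Pine"),
    ("Draw", "Maverick"), ("Draw", "Broadway"),
    ("Broadway", "Outlaw Trail"), ("Outlaw Trail", "Center Aisle"),
    ("Center Aisle", "Bandit"), ("Jake's Lift", "Maverick"),
]

def all_paths(start, goal, max_edges=12):
    """Depth-first search for every path from start to goal
    that uses at most max_edges consecutive edges."""
    adjacency = {}
    for src, dst in EDGES:
        adjacency.setdefault(src, []).append(dst)
    results = []
    def walk(node, path):
        if node == goal:
            results.append(" -> ".join(path))
            return  # stop extending a path once it reaches the goal
        if len(path) > max_edges:  # a path of N nodes has N-1 edges
            return
        for nxt in adjacency.get(node, []):
            walk(nxt, path + [nxt])
    walk(start, [start])
    return results

# Prints the 1-connection route, then the 5-connection route.
for route in sorted(all_paths("Jake's Lift", "Maverick")):
    print(route)
```

With that expected answer in hand, back to SQL.
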
We then need to take every node reachable from `Jake&#39;s Lift`, follow each of its outgoing edges, and keep branching off until a path dead-ends at `Maverick`.&#xA;&#xA;Phew, that&#39;s going to be a long one.&#xA;&#xA;```sql&#xA;SELECT&#xA;    T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node AS path,&#xA;    2 AS connections&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;WHERE&#xA;    T1.from_node = &#39;Jake&#39;&#39;s Lift&#39;&#xA;    AND T2.to_node = &#39;Maverick&#39;;&#xA;```&#xA;&#xA;That is the first branch off from `Jake&#39;s Lift`. If we remove the `T2.to_node = &#39;Maverick&#39;` condition, we see all the two-connection paths from `Jake&#39;s Lift`, not necessarily ending at `Maverick`:&#xA;&#xA;```sql&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node as path,&#xA; 2 as connections&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39;;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node as path,&#xA; 2 as connections&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39;;&#xA;+--------------------------------------------+-------------+&#xA;|                    path                    | connections |&#xA;+--------------------------------------------+-------------+&#xA;| Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside | 2           |&#xA;| Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Sidewinder | 2           |&#xA;+--------------------------------------------+-------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Similarly, we can do it for 3, 4, and so on up to 12 connections.&#xA;&#xA;```sql&#xA;SELECT&#xA;    T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node 
|| &#39; -&gt; &#39; || T3.to_node as path,&#xA;    3 as connections&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39;;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node as path, 3 as connections&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39;&#xA;   ...&gt; ;&#xA;+--------------------------------------------------------+-------------+&#xA;|                          path                          | connections |&#xA;+--------------------------------------------------------+-------------+&#xA;| Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise | 3           |&#xA;+--------------------------------------------------------+-------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;&#xA;Yikes.&#xA;&#xA;That gives a long long query. 
I can hardly bring myself to write it, but here goes:&#xA;&#xA;```sql&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node AS path, 1 AS connections&#xA;FROM mountain_network T1&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T1.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node, 2&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T2.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node, 3&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T3.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node, 4&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T4.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node, 5&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T5.to_node = 
&#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node || &#39; -&gt; &#39; || T6.to_node, 6&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;JOIN mountain_network T6 ON T5.to_node = T6.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T6.to_node = &#39;Maverick&#39;&#xA;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node || &#39; -&gt; &#39; || T6.to_node || &#39; -&gt; &#39; || T7.to_node, 7&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;JOIN mountain_network T6 ON T5.to_node = T6.from_node&#xA;JOIN mountain_network T7 ON T6.to_node = T7.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T7.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node || &#39; -&gt; &#39; || T6.to_node || &#39; -&gt; &#39; || T7.to_node || &#39; -&gt; &#39; || T8.to_node, 8&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = 
T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;JOIN mountain_network T6 ON T5.to_node = T6.from_node&#xA;JOIN mountain_network T7 ON T6.to_node = T7.from_node&#xA;JOIN mountain_network T8 ON T7.to_node = T8.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T8.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node || &#39; -&gt; &#39; || T6.to_node || &#39; -&gt; &#39; || T7.to_node || &#39; -&gt; &#39; || T8.to_node || &#39; -&gt; &#39; || T9.to_node, 9&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;JOIN mountain_network T6 ON T5.to_node = T6.from_node&#xA;JOIN mountain_network T7 ON T6.to_node = T7.from_node&#xA;JOIN mountain_network T8 ON T7.to_node = T8.from_node&#xA;JOIN mountain_network T9 ON T8.to_node = T9.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T9.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node || &#39; -&gt; &#39; || T6.to_node || &#39; -&gt; &#39; || T7.to_node || &#39; -&gt; &#39; || T8.to_node || &#39; -&gt; &#39; || T9.to_node || &#39; -&gt; &#39; || T10.to_node, 10&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;JOIN mountain_network T6 ON T5.to_node = 
T6.from_node&#xA;JOIN mountain_network T7 ON T6.to_node = T7.from_node&#xA;JOIN mountain_network T8 ON T7.to_node = T8.from_node&#xA;JOIN mountain_network T9 ON T8.to_node = T9.from_node&#xA;JOIN mountain_network T10 ON T9.to_node = T10.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T10.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node || &#39; -&gt; &#39; || T6.to_node || &#39; -&gt; &#39; || T7.to_node || &#39; -&gt; &#39; || T8.to_node || &#39; -&gt; &#39; || T9.to_node || &#39; -&gt; &#39; || T10.to_node || &#39; -&gt; &#39; || T11.to_node, 11&#xA;FROM mountain_network T1&#xA;JOIN mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;JOIN mountain_network T6 ON T5.to_node = T6.from_node&#xA;JOIN mountain_network T7 ON T6.to_node = T7.from_node&#xA;JOIN mountain_network T8 ON T7.to_node = T8.from_node&#xA;JOIN mountain_network T9 ON T8.to_node = T9.from_node&#xA;JOIN mountain_network T10 ON T9.to_node = T10.from_node&#xA;JOIN mountain_network T11 ON T10.to_node = T11.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T11.to_node = &#39;Maverick&#39;&#xA;&#xA;UNION ALL&#xA;&#xA;SELECT T1.from_node || &#39; -&gt; &#39; || T1.to_node || &#39; -&gt; &#39; || T2.to_node || &#39; -&gt; &#39; || T3.to_node || &#39; -&gt; &#39; || T4.to_node || &#39; -&gt; &#39; || T5.to_node || &#39; -&gt; &#39; || T6.to_node || &#39; -&gt; &#39; || T7.to_node || &#39; -&gt; &#39; || T8.to_node || &#39; -&gt; &#39; || T9.to_node || &#39; -&gt; &#39; || T10.to_node || &#39; -&gt; &#39; || T11.to_node || &#39; -&gt; &#39; || T12.to_node, 12&#xA;FROM mountain_network T1&#xA;JOIN 
mountain_network T2 ON T1.to_node = T2.from_node&#xA;JOIN mountain_network T3 ON T2.to_node = T3.from_node&#xA;JOIN mountain_network T4 ON T3.to_node = T4.from_node&#xA;JOIN mountain_network T5 ON T4.to_node = T5.from_node&#xA;JOIN mountain_network T6 ON T5.to_node = T6.from_node&#xA;JOIN mountain_network T7 ON T6.to_node = T7.from_node&#xA;JOIN mountain_network T8 ON T7.to_node = T8.from_node&#xA;JOIN mountain_network T9 ON T8.to_node = T9.from_node&#xA;JOIN mountain_network T10 ON T9.to_node = T10.from_node&#xA;JOIN mountain_network T11 ON T10.to_node = T11.from_node&#xA;JOIN mountain_network T12 ON T11.to_node = T12.from_node&#xA;WHERE T1.from_node = &#39;Jake&#39;&#39;s Lift&#39; AND T12.to_node = &#39;Maverick&#39;;&#xA;```&#xA;&#xA;```&#xA;+--------------------------------------------------------------+-------------+&#xA;|                             path                             | connections |&#xA;+--------------------------------------------------------------+-------------+&#xA;| Jake&#39;s Lift -&gt; Maverick                                      | 1           |&#xA;+--------------------------------------------------------------+-------------+&#xA;| Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 5           |&#xA;| aw -&gt; Maverick                                               |             |&#xA;+--------------------------------------------------------------+-------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;&#xA;Ok, we get the result, but that looks like a terrifying query.&#xA;&#xA;Can we do better?&#xA;&#xA;I looked at the tutorial, and SQL has something called a `RECURSIVE CTE`.&#xA;&#xA;Wow!&#xA;&#xA;This is quite challenging to explain.&#xA;&#xA;&#xA;### Recursive CTE&#xA;&#xA;We can create a recursive CTE that keeps extending each path until it either reaches the stop node, `Maverick`, or exceeds `12` connections.&#xA;&#xA;So here&#39;s how it goes.&#xA;&#xA;&#xA;```sql&#xA;WITH RECURSIVE ski_paths AS 
(&#xA;    SELECT&#xA;        from_node,&#xA;        to_node,&#xA;        CAST(from_node || &#39; -&gt; &#39; || to_node AS TEXT) AS full_path,&#xA;        1 AS connections&#xA;    FROM mountain_network&#xA;    WHERE from_node = &#39;Jake&#39;&#39;s Lift&#39;&#xA;&#xA;    UNION ALL&#xA;&#xA;    SELECT&#xA;        mn.from_node,&#xA;        mn.to_node,&#xA;        sp.full_path || &#39; -&gt; &#39; || mn.to_node,&#xA;        sp.connections + 1&#xA;    FROM ski_paths sp&#xA;    JOIN mountain_network mn ON sp.to_node = mn.from_node&#xA;    WHERE sp.connections &lt; 12&#xA;      AND sp.to_node != &#39;Maverick&#39;&#xA;)&#xA;SELECT full_path, connections&#xA;FROM ski_paths&#xA;WHERE to_node = &#39;Maverick&#39;&#xA;ORDER BY connections ASC;&#xA;```&#xA;&#xA;We first define the base case, i.e. what to select first: the rows that start at `Jake&#39;s Lift`. From there on, we build the recursive select statement by referencing the CTE within itself.&#xA;&#xA;The recursive bit is this one:&#xA;&#xA;```sql&#xA;    SELECT&#xA;        mn.from_node,&#xA;        mn.to_node,&#xA;        sp.full_path || &#39; -&gt; &#39; || mn.to_node,&#xA;        sp.connections + 1&#xA;    FROM ski_paths sp&#xA;    JOIN mountain_network mn ON sp.to_node = mn.from_node&#xA;    WHERE sp.connections &lt; 12&#xA;      AND sp.to_node != &#39;Maverick&#39;&#xA;```&#xA;&#xA;We reference the CTE (table) within itself: as long as the end node, the `to_node`, is not `Maverick` and the connection count hasn&#39;t gone beyond `12`, we keep extending the path.&#xA;&#xA;This gives us every reachable path, not just the ones ending at `Maverick`.&#xA;&#xA;Let&#39;s check that first, by querying the CTE as is.&#xA;&#xA;```sql&#xA;WITH RECURSIVE ski_paths AS (&#xA;    SELECT&#xA;        from_node,&#xA;        to_node,&#xA;        CAST(from_node || &#39; -&gt; &#39; || to_node AS TEXT) AS full_path,&#xA;        1 AS connections&#xA;    FROM mountain_network&#xA;    WHERE from_node = &#39;Jake&#39;&#39;s Lift&#39;&#xA;&#xA;  
  UNION ALL&#xA;&#xA;    SELECT&#xA;        mn.from_node,&#xA;        mn.to_node,&#xA;        sp.full_path || &#39; -&gt; &#39; || mn.to_node,&#xA;        sp.connections + 1&#xA;    FROM ski_paths sp&#xA;    JOIN mountain_network mn ON sp.to_node = mn.from_node&#xA;    WHERE sp.connections &lt; 12&#xA;      AND sp.to_node != &#39;Maverick&#39;&#xA;)&#xA;SELECT * from ski_paths;&#xA;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; WITH RECURSIVE ski_paths AS (&#xA;    SELECT&#xA;        from_node,&#xA;        to_node,&#xA;        CAST(from_node || &#39; -&gt; &#39; || to_node AS TEXT) AS full_path,&#xA;        1 AS connections&#xA;    FROM mountain_network&#xA;    WHERE from_node = &#39;Jake&#39;&#39;s Lift&#39;&#xA;&#xA;    UNION ALL&#xA;&#xA;    SELECT&#xA;        mn.from_node,&#xA;        mn.to_node,&#xA;        sp.full_path || &#39; -&gt; &#39; || mn.to_node,&#xA;        sp.connections + 1&#xA;    FROM ski_paths sp&#xA;    JOIN mountain_network mn ON sp.to_node = mn.from_node&#xA;    WHERE sp.connections &lt; 12&#xA;      AND sp.to_node != &#39;Maverick&#39;&#xA;)&#xA;SELECT * from ski_paths;&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;|   from_node   |    to_node     |                          full_path                           | connections |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Jake&#39;s Lift   | Wildwood Lift  | Jake&#39;s Lift -&gt; Wildwood Lift                                 | 1           |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Jake&#39;s Lift   | Maverick       | Jake&#39;s Lift -&gt; Maverick                                      | 1           |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Wildwood Lift | Brightside     | Jake&#39;s Lift -&gt; 
Wildwood Lift -&gt; Brightside                   | 2           |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Wildwood Lift | Sidewinder     | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Sidewinder                   | 2           |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Brightside    | Moonrise       | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise       | 3           |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Moonrise      | Draw           | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 4           |&#xA;|               |                | aw                                                           |             |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Moonrise      | Lone Pine      | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Lo | 4           |&#xA;|               |                | ne Pine                                                      |             |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Draw          | Broadway       | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 5           |&#xA;|               |                | aw -&gt; Broadway                                               |             |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Draw          | Maverick       | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 5           |&#xA;|               |                | aw -&gt; Maverick                                               | 
            |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Broadway      | Outlaw Trail   | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 6           |&#xA;|               |                | aw -&gt; Broadway -&gt; Outlaw Trail                               |             |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Outlaw Trail  | Center Aisle   | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 7           |&#xA;|               |                | aw -&gt; Broadway -&gt; Outlaw Trail -&gt; Center Aisle               |             |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Center Aisle  | Bandit         | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 8           |&#xA;|               |                | aw -&gt; Broadway -&gt; Outlaw Trail -&gt; Center Aisle -&gt; Bandit     |             |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;| Center Aisle  | Lower Stampede | Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 8           |&#xA;|               |                | aw -&gt; Broadway -&gt; Outlaw Trail -&gt; Center Aisle -&gt; Lower Stam |             |&#xA;|               |                | pede                                                         |             |&#xA;+---------------+----------------+--------------------------------------------------------------+-------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;You can see we have got quite a lot of paths, this are all the paths that start from `Jake&#39;s Lift` and have less than `12` connections.&#xA;&#xA;So, now we can simply filter with the `WHERE` clause in the case where the 
`to_node = &#39;Maverick&#39;` and we would get the result.&#xA;&#xA;&#xA;&#xA;```&#xA;sqlite&gt; WITH RECURSIVE ski_paths AS (&#xA;    SELECT&#xA;        from_node,&#xA;        to_node,&#xA;        CAST(from_node || &#39; -&gt; &#39; || to_node AS TEXT) AS full_path,&#xA;        1 AS connections&#xA;    FROM mountain_network&#xA;    WHERE from_node = &#39;Jake&#39;&#39;s Lift&#39;&#xA;&#xA;    UNION ALL&#xA;&#xA;    SELECT&#xA;        mn.from_node,&#xA;        mn.to_node,&#xA;        sp.full_path || &#39; -&gt; &#39; || mn.to_node,&#xA;        sp.connections + 1&#xA;    FROM ski_paths sp&#xA;    JOIN mountain_network mn ON sp.to_node = mn.from_node&#xA;    WHERE sp.connections &lt; 12&#xA;      AND sp.to_node != &#39;Maverick&#39;&#xA;)&#xA;SELECT full_path, connections&#xA;FROM ski_paths&#xA;WHERE to_node = &#39;Maverick&#39;&#xA;ORDER BY connections ASC;&#xA;+--------------------------------------------------------------+-------------+&#xA;|                          full_path                           | connections |&#xA;+--------------------------------------------------------------+-------------+&#xA;| Jake&#39;s Lift -&gt; Maverick                                      | 1           |&#xA;+--------------------------------------------------------------+-------------+&#xA;| Jake&#39;s Lift -&gt; Wildwood Lift -&gt; Brightside -&gt; Moonrise -&gt; Dr | 5           |&#xA;| aw -&gt; Maverick                                               |             |&#xA;+--------------------------------------------------------------+-------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Ok, that gave the result correctly.&#xA;&#xA;And this is it!&#xA;&#xA;Sweet problem to learn about Recursive CTE!&#xA;&#xA;Off to the final day of advent of SQL 2025!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 12: Archive Flight Records</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-12</link>
      <description>Advent of SQL - Day 12, Archive Flight Records We are on Day 12! Phew, it&#39;s almost done! Just 3 days more! Let&#39;s get the SQL! We have just one table and a couple</description>
      <pubDate>Sat, 27 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL - Day 12, Archive Flight Records&#xA;&#xA;We are on Day 12! Phew its almost done! Just 3 days more!&#xA;&#xA;Let&#39;s get the SQL!&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS archive_records;&#xA;&#xA;CREATE TABLE archive_records (&#xA;    id INT PRIMARY KEY,&#xA;    title TEXT,&#xA;    description TEXT&#xA;);&#xA;&#xA;INSERT INTO archive_records (id, title, description) VALUES&#xA;(1, &#39;Flight Stabilization Prototype Analysis&#39;, &#39;This report details the latest advancements in stabilizing aerial maneuvers for enchanted sleighs. Initial tests yielded promising results, showcasing a marked decrease in turbulence during airborne navigation.&#39;),&#xA;(2, &#39;Lift Calibration Incident Log&#39;, &#39;During a routine lift calibration test, the sleigh experienced an unexpected upward surge, causing it to hover dangerously close to the workshop ceiling. Subsequent analysis revealed a miscalculation in the weight distribution formula, prompting a thorough review of all aerodynamic coefficients.&#39;),&#xA;(3, &#39;Aerial Aspirations: The Great Flop&#39;, &#39;Despite the initial excitement surrounding the design of our feather-laden airborne contraption, the prototype proved less than buoyant. The unexpected descent resulted in a rather spectacular cloud of glitter and twigs, serving as a vivid reminder that not all dreams of flight take wing as intended.&#39;),&#xA;(4, &#39;Optimized Sled Dynamics&#39;, &#39;This document explores advanced mechanics behind sled propulsion and movement efficiency. Through extensive calculations and enchanted material tests, the elves aim to refine turning capabilities and enhance downhill speed.&#39;),&#xA;(5, &#39;Reindeer Harness Design Flaws&#39;, &#39;The latest prototype of our reindeer harness was found to inhibit mobility, causing undue strain on the animals during testing. 
Observations indicated that the weight distribution was poorly calibrated, necessitating a complete redesign for optimal comfort and performance.&#39;),&#xA;(6, &#39;Streamlined Cargo Routing System&#39;, &#39;In our latest endeavor, we have implemented a magical algorithm to optimize the route taken by our toy-laden sleighs. This groundbreaking update minimizes transport time between the workshop and delivery points, ensuring that holiday cheer reaches every home even faster!&#39;),&#xA;(7, &#39;Toy Durability Testing Protocols&#39;, &#39;The elves meticulously conducted stress evaluations on the latest toy prototypes to determine their resilience under various conditions. Initial findings indicate that while some designs withstood rigorous play, others required reinforcements to avoid premature wear and tear.&#39;),&#xA;(8, &#39;Weather Resistance Breakthrough: Frost Shielding&#39;, &#39;In our recent experiments, we discovered an innovative composite material that effectively repels moisture while withstanding extreme cold. This newfound frost shielding could revolutionize our outdoor toys, ensuring they remain both functional and enchanting, even in the harshest winter conditions.&#39;),&#xA;(9, &#39;Safety Compliance Check Overview&#39;, &#39;In the pursuit of enchantment and joy, this document outlines the mandatory safety compliance measures for all workshop operations. Each elf must adhere strictly to these guidelines to ensure the safe transport and handling of our delicate prototypes, thus preventing any unforeseen magical mishaps.&#39;),&#xA;(10, &#39;Sleigh Skim Mechanism Upgrade&#39;, &#39;The experiment involved enchanting the underside of the sleigh with a whispering wind charm to achieve unprecedented speeds. 
Results were alarming, leading to uncontrollable flight trajectories and sudden descents—definitely do not attempt again.&#39;),&#xA;(11, &#39;Caution: Enchanted Toy Prototype&#39;, &#39;This design incorporates a reactive magic component that may unpredictably animate in the presence of mischief. Ensure all test environments are secured against spontaneous giggles and potential chaos.&#39;),&#xA;(12, &#39;Workshop Experiment Safety Checklist&#39;, &#39;Before embarking on any workshop experiments, ensure all safety goggles are securely fastened to prevent debris from interfered visions. Always double-check that the workspace is free of clutter, as unexpected accidents can arise from even the smallest flurry of trinkets and tools.&#39;),&#xA;(13, &#39;Cocoa Bean Roasting Innovations&#39;, &#39;This review explores the latest techniques in roasting cocoa beans to achieve unparalleled flavor profiles. Adjustments to temperature and timing have led to a delightful spectrum of aromas, promising to elevate our confectionery creations to new heights.&#39;),&#xA;(14, &#39;Intricate Snowflake Ornament Design&#39;, &#39;This design blueprint outlines the geometric intricacies of a multi-faceted snowflake ornament, emphasizing a balance between elegance and structural integrity. Each arm is meticulously patterned to reflect light, creating a shimmering effect that dances with the seasons, while ensuring optimal symmetry for enchanting visual appeal.&#39;),&#xA;(15, &#39;Gift Box Assembly Prototype&#39;, &#39;This prototype outlines the intricate process of assembling the enchanted gift boxes designed to withstand the whims of time and space. 
Each step must be meticulously executed to ensure that every box not only sparkles with joy but also maintains its magical properties through every unwrapping.&#39;),&#xA;(16, &#39;Wrap-It-Up: Innovative Designs&#39;, &#39;This experimental report explores various materials and techniques for creating enchanted wrapping paper that enhances the gift-giving experience. Initial findings suggest that incorporation of shimmering elven dust can amplify the aesthetic appeal while maintaining structural integrity during airborne delivery.&#39;),&#xA;(17, &#39;Magical Confection Fusion Results&#39;, &#39;The experimental concoction blended sugar crystals with essence of starlight, resulting in a luminescent treat that sparkles enchantingly. However, a curious side effect was noted: excessive giggling among taste testers, raising questions about potential airborne laughter.&#39;),&#xA;(18, &#39;Luminous Ornament Crafting Techniques&#39;, &#39;In our continuous pursuit of radiance, this document outlines innovative methods for creating ornaments that glow with enchantment. Engaging both traditional techniques and modern enchantments, each design is meant to instill joy and sparkle during the festive season.&#39;),&#xA;(19, &#39;Magical Energy Conduction Analysis&#39;, &#39;Recent experiments have shown that the flow of magical energy through crystalline conduits behaves unpredictably under varying lunar phases. Further investigation into the correlation between ambient mana levels and energy stability is necessary to optimize enchantment potency.&#39;),&#xA;(20, &#39;Elven Workshop Organization Protocols&#39;, &#39;The implementation of open shelving systems has significantly increased accessibility to essential materials, thus enhancing workflow efficiency. 
Furthermore, the organization of tools into color-coded bins ensures that each elf can swiftly locate their required implements without disrupting the harmony of the workshop.&#39;);&#xA;&#xA;```&#xA;&#xA;We have just one table and a couple of text columns. That&#39;s it; this looks like a string-searching problem.&#xA;&#xA;Let&#39;s head to the problem statement!&#xA;&#xA;&#xA;## Problem&#xA;&#xA;&gt; Using the `archive_records` table, search both the `title` and `description` fields for the term &#34;fly&#34;. Make sure that you also match for words like &#34;flying&#34;, &#34;flight&#34;, etc. Boost the results where the term appears in the title and lastly, rank the results by relevance (most relevant first). Provide the elves the top 5 most relevant archived records back.&#xA;&#xA;It is a text search use case indeed!&#xA;&#xA;We need to find and rank the records matching words like `fly`, `flying`, and `flight`. It gets tricky if we miss any variants that are not hard-coded.&#xA;&#xA;We need to boost matches in `title`, so a hit in the title carries more weight than one in the description. Makes sense!&#xA;&#xA;Let&#39;s start simple and move into full-text search in SQLite!&#xA;&#xA;&#xA;### Simple String Matching&#xA;&#xA;We start with a simple nested `CASE WHEN THEN` condition. 
We check if the `title` has `fly`, `flying`, `flight`, etc., and if so we set the title score to `2`, then add `1` if the same terms appear in the `description`.&#xA;&#xA;So,&#xA;- If the search term (fly, flight, etc.) appears **only** in the title, the score is `2`&#xA;- If the search term appears **only** in the description, the score is `1`&#xA;- If the search term appears in **both** title and description, the score is `3` since we are adding the scores.&#xA;- If the search term doesn&#39;t appear at all, the score remains `0`.&#xA;&#xA;We simply assign the score based on the appearance of the search term, then order the result by the computed `rank` and list the top 5.&#xA;&#xA;```sql&#xA;SELECT &#xA;    id,&#xA;    title,&#xA;    description,&#xA;    (&#xA;        CASE WHEN LOWER(title) LIKE &#39;%fly%&#39; OR LOWER(title) LIKE &#39;%flight%&#39; OR LOWER(title) LIKE &#39;%flying%&#39; THEN 2 ELSE 0 END +&#xA;        CASE WHEN LOWER(description) LIKE &#39;%fly%&#39; OR LOWER(description) LIKE &#39;%flight%&#39; OR LOWER(description) LIKE &#39;%flying%&#39; THEN 1 ELSE 0 END&#xA;    ) AS rank&#xA;FROM archive_records&#xA;ORDER BY rank DESC, id ASC&#xA;LIMIT 5;&#xA;```&#xA;&#xA;However, we can shrink the search space with a `WHERE` clause, so the rank is only computed for the relevant records before the result set is produced.&#xA;&#xA;Hence, we filter to rows where the `title` or `description` contains a relevant word, then order by the computed `rank` score and list the top `5`.&#xA;&#xA;```sql&#xA;SELECT &#xA;    id,&#xA;    title,&#xA;    description,&#xA;    (&#xA;        -- Title matches worth 2 points (boosted relevance)&#xA;        CASE WHEN LOWER(title) LIKE &#39;%fly%&#39; OR LOWER(title) LIKE &#39;%flight%&#39; OR LOWER(title) LIKE &#39;%flying%&#39; THEN 2 ELSE 0 END +&#xA;        -- Description matches worth 1 point&#xA;        CASE WHEN 
LOWER(description) LIKE &#39;%fly%&#39; OR LOWER(description) LIKE &#39;%flight%&#39; OR LOWER(description) LIKE &#39;%flying%&#39; THEN 1 ELSE 0 END&#xA;    ) AS rank&#xA;FROM archive_records&#xA;WHERE &#xA;    LOWER(title) LIKE &#39;%fly%&#39; OR &#xA;    LOWER(title) LIKE &#39;%flight%&#39; OR &#xA;    LOWER(title) LIKE &#39;%flying%&#39; OR&#xA;    LOWER(description) LIKE &#39;%fly%&#39; OR &#xA;    LOWER(description) LIKE &#39;%flight%&#39; OR &#xA;    LOWER(description) LIKE &#39;%flying%&#39;&#xA;ORDER BY rank DESC, id ASC&#xA;LIMIT 5;&#xA;```&#xA;&#xA;We can wrap it in a CTE, though that doesn&#39;t do anything differently.&#xA;&#xA;With the CTE below, we first flag each match with `1` or `0` for the `title` and the `description`. Then, in the outer query, we weight a `title` match by `2` and a `description` match by `1` and add them up, getting the same `rank` for each record.&#xA;&#xA;```sql&#xA;WITH search_results AS (&#xA;    SELECT &#xA;        id,&#xA;        title,&#xA;        description,&#xA;        CASE WHEN LOWER(title) LIKE &#39;%fly%&#39; OR LOWER(title) LIKE &#39;%flight%&#39; OR LOWER(title) LIKE &#39;%flying%&#39; &#xA;            THEN 1 ELSE 0 END AS title_match,&#xA;        CASE WHEN LOWER(description) LIKE &#39;%fly%&#39; OR LOWER(description) LIKE &#39;%flight%&#39; OR LOWER(description) LIKE &#39;%flying%&#39; &#xA;            THEN 1 ELSE 0 END AS desc_match&#xA;    FROM archive_records&#xA;    WHERE &#xA;        LOWER(title) LIKE &#39;%fly%&#39; OR LOWER(title) LIKE &#39;%flight%&#39; OR LOWER(title) LIKE &#39;%flying%&#39; OR&#xA;        LOWER(description) LIKE &#39;%fly%&#39; OR LOWER(description) LIKE &#39;%flight%&#39; OR LOWER(description) LIKE &#39;%flying%&#39;&#xA;)&#xA;SELECT &#xA;    id,&#xA;    title,&#xA;    description,&#xA;    (title_match * 2 + desc_match * 1) AS rank&#xA;FROM search_results&#xA;ORDER BY rank DESC, id ASC&#xA;LIMIT 5;&#xA;```&#xA;&#xA;### Assign score based on matching 
frequency&#xA;&#xA;We can also improve this by counting how many times each term appears in a column and computing a cumulative rank. This way the score leans harder towards the most relevant docs.&#xA;&#xA;So, we write something like this.&#xA;&#xA;- Compute the length of the full column, for example the title with `LENGTH(LOWER(title))`, then subtract the length of what is left after removing `fly` (or `flight`, and so on). That difference is the number of characters the removed occurrences occupied. Each &#39;fly&#39; is 3 characters, so the difference divided by 3 gives the number of occurrences of &#39;fly&#39;. Hence `(LENGTH(LOWER(title)) - LENGTH(REPLACE(LOWER(title), &#39;fly&#39;, &#39;&#39;))) / 3` counts how many times &#39;fly&#39; appears. Similarly, for `flight` and `flying` the divisor is `6`, and so on.&#xA;- If the count comes from the title, it is multiplied by `2`&#xA;- If it comes from the description, it is kept as is (multiplied by `1`), but you can weight it accordingly, as there could be multiple columns to weigh.&#xA;&#xA;```sql&#xA;SELECT &#xA;    id,&#xA;    title,&#xA;    description,&#xA;    (&#xA;&#xA;        ((LENGTH(LOWER(title)) - LENGTH(REPLACE(LOWER(title), &#39;fly&#39;, &#39;&#39;))) / 3 +&#xA;         (LENGTH(LOWER(title)) - LENGTH(REPLACE(LOWER(title), &#39;flight&#39;, &#39;&#39;))) / 6 +&#xA;         (LENGTH(LOWER(title)) - LENGTH(REPLACE(LOWER(title), &#39;flying&#39;, &#39;&#39;))) / 6&#xA;        ) * 2&#xA;        +&#xA;&#xA;        ((LENGTH(LOWER(description)) - LENGTH(REPLACE(LOWER(description), &#39;fly&#39;, &#39;&#39;))) / 3 +&#xA;         (LENGTH(LOWER(description)) - LENGTH(REPLACE(LOWER(description), &#39;flight&#39;, &#39;&#39;))) / 6 +&#xA;         (LENGTH(LOWER(description)) - LENGTH(REPLACE(LOWER(description), &#39;flying&#39;, &#39;&#39;))) / 6&#xA;        )&#xA;    ) AS rank&#xA;FROM archive_records&#xA;ORDER BY rank DESC, id ASC&#xA;LIMIT 5;&#xA;```&#xA;&#xA;You can always add a `WHERE` 
clause to reduce the search space upfront:&#xA;&#xA;```sql&#xA;-- Simplified frequency-based ranking&#xA;SELECT &#xA;    id,&#xA;    title,&#xA;    description,&#xA;    (&#xA;        ((LENGTH(LOWER(title)) - LENGTH(REPLACE(LOWER(title), &#39;fly&#39;, &#39;&#39;))) / 3 +&#xA;         (LENGTH(LOWER(title)) - LENGTH(REPLACE(LOWER(title), &#39;flight&#39;, &#39;&#39;))) / 6 +&#xA;         (LENGTH(LOWER(title)) - LENGTH(REPLACE(LOWER(title), &#39;flying&#39;, &#39;&#39;))) / 6&#xA;        ) * 2&#xA;        +&#xA;        ((LENGTH(LOWER(description)) - LENGTH(REPLACE(LOWER(description), &#39;fly&#39;, &#39;&#39;))) / 3 +&#xA;         (LENGTH(LOWER(description)) - LENGTH(REPLACE(LOWER(description), &#39;flight&#39;, &#39;&#39;))) / 6 +&#xA;         (LENGTH(LOWER(description)) - LENGTH(REPLACE(LOWER(description), &#39;flying&#39;, &#39;&#39;))) / 6&#xA;        )&#xA;    ) AS rank&#xA;FROM archive_records&#xA;WHERE &#xA;    LOWER(title) LIKE &#39;%fly%&#39; OR LOWER(title) LIKE &#39;%flight%&#39; OR LOWER(title) LIKE &#39;%flying%&#39; OR&#xA;    LOWER(description) LIKE &#39;%fly%&#39; OR LOWER(description) LIKE &#39;%flight%&#39; OR LOWER(description) LIKE &#39;%flying%&#39;&#xA;ORDER BY rank DESC, id ASC&#xA;LIMIT 5;&#xA;```&#xA;&#xA;This gives similar but slightly different results, since the rank now reflects how many times each term appears, not just whether it appears at all.&#xA;&#xA;We can even put the list of words and their weights in a CTE and use it dynamically in the actual query:&#xA;&#xA;```sql&#xA;WITH keywords AS (&#xA;    SELECT &#39;fly&#39; AS term, 2.0 AS title_weight, 1.0 AS desc_weight&#xA;    UNION ALL&#xA;    SELECT &#39;flight&#39;, 2.0, 1.0&#xA;    UNION ALL&#xA;    SELECT &#39;flying&#39;, 2.0, 1.0&#xA;),&#xA;ranked AS (&#xA;    SELECT &#xA;        a.id,&#xA;        a.title,&#xA;        a.description,&#xA;        SUM(&#xA;            ((LENGTH(LOWER(a.title)) - LENGTH(REPLACE(LOWER(a.title), k.term, &#39;&#39;))) / LENGTH(k.term)) * 
k.title_weight +&#xA;            ((LENGTH(LOWER(a.description)) - LENGTH(REPLACE(LOWER(a.description), k.term, &#39;&#39;))) / LENGTH(k.term)) * k.desc_weight&#xA;        ) AS rank&#xA;    FROM archive_records a&#xA;    CROSS JOIN keywords k&#xA;    GROUP BY a.id&#xA;)&#xA;SELECT id, title, description, rank&#xA;FROM ranked&#xA;ORDER BY rank DESC, id ASC&#xA;LIMIT 5;&#xA;&#xA;```&#xA;&#xA;Here we have two CTEs:&#xA;- `keywords` defining the words to search for and their weights&#xA;- `ranked` computing the weighted sum of term frequencies over the relevant columns.&#xA;&#xA;As you can see, just by adding a keyword and its weight to the CTE, the rest of the query works without changing anything.&#xA;&#xA;&#xA;### Full Text Search&#xA;&#xA;We can now also look at [FTS](https://sqlite.org/fts5.html), or Full Text Search, in SQLite.&#xA;&#xA;In SQLite, we can create a `VIRTUAL TABLE`, which is&#xA;- A table computed on the fly&#xA;- One that doesn&#39;t exist as a physical entity in the database&#xA;&#xA;We have [fts5](https://sqlite.org/fts5.html), which lets us search text columns using match queries, boolean operators, and query expressions.&#xA;&#xA;```sql&#xA;CREATE VIRTUAL TABLE IF NOT EXISTS archive_fts USING fts5(&#xA;    title, &#xA;    description,&#xA;    content=archive_records&#xA;);&#xA;```&#xA;&#xA;We need to provide the columns that we want to search against. In this case we want `title` and `description` as the columns. 
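
As an aside, fts5 can also rank matches itself: its built-in `bm25()` function returns a relevance score (lower is more relevant) and accepts per-column weights, which maps nicely onto the &#34;boost the title&#34; requirement. A minimal sketch using Python&#39;s stdlib `sqlite3` driver, on a made-up three-row corpus rather than the puzzle data (assumes your SQLite build ships with fts5):

```python
import sqlite3

con = sqlite3.connect(":memory:")
# Tiny stand-in corpus, not the puzzle data
con.execute("CREATE VIRTUAL TABLE docs USING fts5(title, description)")
con.executemany(
    "INSERT INTO docs VALUES (?, ?)",
    [
        ("Flight Log", "notes from flying tests"),
        ("Sled Dynamics", "downhill speed analysis"),
        ("Toy Report", "the flight prototype worked well"),
    ],
)
# bm25(docs, 2.0, 1.0) weights title matches double the description ones
rows = con.execute(
    "SELECT title, bm25(docs, 2.0, 1.0) AS score "
    "FROM docs WHERE docs MATCH 'flight OR flying' "
    "ORDER BY score"
).fetchall()
for title, score in rows:
    print(title, score)
```

`Flight Log` comes out first here: it matches in the double-weighted title as well as the description, while `Toy Report` only matches in the description.
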
The `content=archive_records` option links the FTS table to the real table `archive_records`: instead of storing its own copy of all the text, the FTS table indexes the data from `archive_records`.&#xA;&#xA;Then we insert all the records from `archive_records`, to make the index aware of the rows existing in the actual table.&#xA;&#xA;```sql&#xA;INSERT OR IGNORE INTO archive_fts(rowid, title, description)&#xA;SELECT id, title, description FROM archive_records;&#xA;```&#xA;&#xA;Then we can query it like so:&#xA;&#xA;```sql&#xA;SELECT * FROM archive_fts WHERE archive_fts MATCH &#39;fly&#39;;&#xA;```&#xA;&#xA;We have a few ways to search:&#xA;1. MATCH&#xA;2. Boolean operators&#xA;3. Expressions and wildcards&#xA;&#xA;We can simply filter the records with this:&#xA;&#xA;```sql&#xA;SELECT *&#xA;FROM archive_fts&#xA;JOIN archive_records ON archive_fts.rowid = archive_records.id&#xA;WHERE archive_fts MATCH &#39;fly OR flight OR flying&#39;;&#xA;```&#xA;&#xA;We have added `MATCH &#39;fly OR flight OR flying&#39;` to limit the search space to those keywords.&#xA;&#xA;&#xA;```sql&#xA;SELECT &#xA;    archive_records.id,&#xA;    archive_records.title,&#xA;    archive_records.description,&#xA;    (&#xA;        CASE WHEN archive_records.title LIKE &#39;%fly%&#39; OR archive_records.title LIKE &#39;%flight%&#39; OR archive_records.title LIKE &#39;%flying%&#39; THEN 2 ELSE 0 END +&#xA;        CASE WHEN archive_records.description LIKE &#39;%fly%&#39; OR archive_records.description LIKE &#39;%flight%&#39; OR archive_records.description LIKE &#39;%flying%&#39; THEN 1 ELSE 0 END&#xA;    ) AS rank&#xA;FROM archive_fts&#xA;JOIN archive_records ON archive_fts.rowid = archive_records.id&#xA;WHERE archive_fts MATCH &#39;fly OR flight OR flying&#39;&#xA;ORDER BY rank DESC, archive_records.id ASC&#xA;LIMIT 5;&#xA;```&#xA;&#xA;This gives us the relevant results through a proper full-text match, which is way better than `LIKE` with `%` wildcards on the 
columns.&#xA;&#xA;So, those are the approaches I like taking on day 12.&#xA;&#xA;It was fun working with full-text search for the first time!&#xA;&#xA;On to day 13!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 13: XML Travel Manifests</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-13</link>
<description>Advent of SQL - Day 13, XML Travel Manifests It&#39;s day 13 of Advent of SQL, we have some XML to parse, which I don&#39;t think SQL can handle, but string manipulation</description>
      <pubDate>Sat, 27 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL - Day 13, XML Travel Manifests&#xA;&#xA;Its day 13 of Advent of SQL, we have some xml to parse, which I don&#39;t think SQL can handle, but string manipulation to the rescue.&#xA;&#xA;Let&#39;s get the SQL for the day:&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS travel_manifests;&#xA;&#xA;CREATE TABLE travel_manifests (&#xA;    manifest_id INT PRIMARY KEY,&#xA;    vehicle_id TEXT,&#xA;    departure_time TIMESTAMP,&#xA;    manifest_xml XML&#xA;);&#xA;&#xA;INSERT INTO travel_manifests (manifest_id, vehicle_id, departure_time, manifest_xml) VALUES&#xA;  (1, &#39;SLEIGH-01&#39;, &#39;2025-12-22 06:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Sofia Kim&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;engine_check&gt;ignored&lt;/engine_check&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Jonah Wolfe&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (2, &#39;SLEIGH-07&#39;, &#39;2025-12-23 13:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Ravi Patel&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Keiko Ito&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Anya Pavlov&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Carter 
Lewis&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;engine_check&gt;ignored&lt;/engine_check&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Layla Brooks&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;reindeer_mood&gt;ignored&lt;/reindeer_mood&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (3, &#39;FLIGHT-NP-9&#39;, &#39;2025-12-22 18:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Keiko Ito&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Jonah Wolfe&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Diego Ramos&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Priya Das&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Layla Brooks&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (4, &#39;TRAIN-ICE-3&#39;, &#39;2025-12-22 18:00:00&#39;, &#39;&lt;manifest&gt;&lt;reindeer_mood&gt;low&lt;/reindeer_mood&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Isla Torres&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Ravi Patel&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Hiro Tanaka&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Priya Das&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (5, &#39;FLIGHT-NP-9&#39;, &#39;2025-12-22 17:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Nia 
Grant&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Mateo Cruz&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;snowfall_inches&gt;ignored&lt;/snowfall_inches&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (6, &#39;CARGO-12&#39;, &#39;2025-12-22 15:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Carter Lewis&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Sofia Kim&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Hiro Tanaka&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Lucas Ford&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (7, &#39;SLEIGH-01&#39;, &#39;2025-12-22 11:00:00&#39;, &#39;&lt;manifest&gt;&lt;snack_inventory&gt;unknown&lt;/snack_inventory&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Priya Das&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Diego Ramos&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Lucas Ford&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Carter Lewis&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;reindeer_mood&gt;ignored&lt;/reindeer_mood&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Hiro Tanaka&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;weather_note&gt;ignored&lt;/weather_note&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Zara 
Sheikh&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (8, &#39;CARGO-12&#39;, &#39;2025-12-23 13:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Jonah Wolfe&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Layla Brooks&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Leo Becker&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;weather_note&gt;ignored&lt;/weather_note&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Ravi Patel&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Sofia Kim&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Elena Morales&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (9, &#39;SLEIGH-01&#39;, &#39;2025-12-23 10:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Sofia Kim&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Bianca Pereira&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Zara Sheikh&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Elena Morales&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;reindeer_mood&gt;ignored&lt;/reindeer_mood&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Hiro Tanaka&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Keiko 
Ito&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (10, &#39;SLEIGH-01&#39;, &#39;2025-12-22 21:00:00&#39;, &#39;&lt;manifest&gt;&lt;snowfall_inches&gt;low&lt;/snowfall_inches&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Ava Johnson&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Priya Das&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Mateo Cruz&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Bianca Pereira&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Leo Becker&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (11, &#39;SLEIGH-07&#39;, &#39;2025-12-23 10:00:00&#39;, &#39;&lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Jonah Wolfe&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Bianca Pereira&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Anya Pavlov&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (12, &#39;SLEIGH-01&#39;, &#39;2025-12-22 08:00:00&#39;, &#39;&lt;manifest&gt;&lt;reindeer_mood&gt;ok&lt;/reindeer_mood&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Ravi Patel&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Bianca 
Pereira&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Keiko Ito&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (13, &#39;FLIGHT-NP-9&#39;, &#39;2025-12-22 11:00:00&#39;, &#39;&lt;manifest&gt;&lt;snowfall_inches&gt;ok&lt;/snowfall_inches&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Elena Morales&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (14, &#39;SLEIGH-01&#39;, &#39;2025-12-22 14:00:00&#39;, &#39;&lt;manifest&gt;&lt;engine_check&gt;high&lt;/engine_check&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Layla Brooks&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;weather_note&gt;ignored&lt;/weather_note&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Anya Pavlov&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Keiko Ito&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Zara Sheikh&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;engine_check&gt;ignored&lt;/engine_check&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Sofia Kim&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;),&#xA;  (15, &#39;FLIGHT-NP-9&#39;, &#39;2025-12-22 14:00:00&#39;, &#39;&lt;manifest&gt;&lt;snowfall_inches&gt;ok&lt;/snowfall_inches&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Priya Das&lt;/name&gt;&lt;ticket_class&gt;priority&lt;/ticket_class&gt;&lt;snowfall_inches&gt;ignored&lt;/snowfall_inches&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Bianca 
Pereira&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;&#39;);&#xA;```&#xA;&#xA;Just one table, with some wild XML.&#xA;&#xA;Let&#39;s see what we need to do in the problem statement.&#xA;&#xA;## Problem&#xA;&#xA;&gt; Using the `travel_manifests` table, extract the passenger information from the XML data and produce a report that shows all of the departure times for &#34;CARGO&#34; vehicles that have more than 20 passengers booked. Include in the results:&#xA;&gt; &#xA;&gt; - The vehicle_id&#xA;&gt; - The departure_time&#xA;&gt; - The total number of passengers on that departure&#xA;&gt; - Order the results by departure_time.&#xA;&#xA;&#xA;Ok, so we need the number of passengers per departure, for the records whose vehicle is of type `CARGO` and that have more than 20 passengers booked.&#xA;&#xA;Interesting!&#xA;&#xA;Let&#39;s look at one record.&#xA;&#xA;```&#xA;sqlite&gt; .schema&#xA;CREATE TABLE travel_manifests (&#xA;    manifest_id INT PRIMARY KEY,&#xA;    vehicle_id TEXT,&#xA;    departure_time TIMESTAMP,&#xA;    manifest_xml XML&#xA;);&#xA;sqlite&gt; SELECT * FROM travel_manifests WHERE id = 1;&#xA;Parse error: no such column: id&#xA;  SELECT * FROM travel_manifests WHERE id = 1;&#xA;                         error here ---^&#xA;sqlite&gt; SELECT * FROM travel_manifests LIMIT 1;&#xA;+-------------+------------+---------------------+--------------------------------------------------------------+&#xA;| manifest_id | vehicle_id |   departure_time    |                         manifest_xml                         |&#xA;+-------------+------------+---------------------+--------------------------------------------------------------+&#xA;| 1           | SLEIGH-01  | 2025-12-22 06:00:00 | &lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;tick |&#xA;|             |         
   |                     | et_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;nam |&#xA;|             |            |                     | e&gt;Sofia Kim&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;eng |&#xA;|             |            |                     | ine_check&gt;ignored&lt;/engine_check&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name |&#xA;|             |            |                     | &gt;Jonah Wolfe&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;sna |&#xA;|             |            |                     | ck_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;/passenge |&#xA;|             |            |                     | rs&gt;&lt;/manifest&gt;                                               |&#xA;+-------------+------------+---------------------+--------------------------------------------------------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;So, we have the following columns:&#xA;&#xA;- `vehicle_id`, which I think is for filtering down to the `CARGO` vehicles only&#xA;- `departure_time`, which we just return as is&#xA;- `manifest_xml`, oh! This is XML and it has the passenger details. 
&#xA;&#xA;If we look carefully, we can see the XML looks like this:&#xA;&#xA;```xml&#xA;&lt;manifest&gt;&#xA;    &lt;passengers&gt;&#xA;        &lt;passenger&gt;&#xA;            &lt;name&gt;Nia Grant&lt;/name&gt;&#xA;            &lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&#xA;        &lt;/passenger&gt;&#xA;        &lt;passenger&gt;&#xA;            &lt;name&gt;Sofia Kim&lt;/name&gt;&#xA;            &lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&#xA;            &lt;engine_check&gt;ignored&lt;/engine_check&gt;&#xA;        &lt;/passenger&gt;&#xA;        &lt;passenger&gt;&#xA;            &lt;name&gt;Jonah Wolfe&lt;/name&gt;&#xA;            &lt;ticket_class&gt;standard&lt;/ticket_class&gt;&#xA;            &lt;snack_inventory&gt;ignored&lt;/snack_inventory&gt;&#xA;        &lt;/passenger&gt;&#xA;    &lt;/passengers&gt;&#xA;&lt;/manifest&gt;&#xA;```&#xA;&#xA;We have `manifest`, which has a `passengers` property, which is a list of `passenger` tags; each `passenger` element has details like `name`, `ticket_class`, etc.&#xA;&#xA;We only want the count of passengers; how can we get that? The dirtiest way is to count the occurrences of `&lt;passenger&gt;` or `&lt;/passenger&gt;` in the XML string. &#xA;&#xA;We can do that with length arithmetic (which is the dirty part: a stray `&lt;passenger&gt;` hidden somewhere in the text could break this logic, but on valid XML like ours it works). We replace the string `&lt;passenger&gt;` with the empty string `&#39;&#39;` and count the characters that are left, which gives us the difference between the total number of characters and the number of characters occupied by the `&lt;passenger&gt;` occurrences. 
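
In Python terms, the same arithmetic looks like this (a quick sketch on a made-up string, just to sanity-check the idea):

```python
# Every removed occurrence shortens the string by len(needle)
# characters, so the length difference divided by len(needle)
# is the occurrence count.
s = "ho ho ho, merry sleighing"
needle = "ho"
count = (len(s) - len(s.replace(needle, ""))) // len(needle)
print(count)  # 3
```
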
If we divide this difference by the length of `&lt;passenger&gt;`, we get the number of times the `&lt;passenger&gt;` string is present in the XML string.&#xA;&#xA;Let&#39;s take the example from above. The length of the XML string is 374.&#xA;&#xA;```sql&#xA;SELECT LENGTH(manifest_xml) FROM travel_manifests LIMIT 1;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT LENGTH(manifest_xml) FROM travel_manifests LIMIT 1;&#xA;+----------------------+&#xA;| LENGTH(manifest_xml) |&#xA;+----------------------+&#xA;| 374                  |&#xA;+----------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Let&#39;s replace the occurrences of `&lt;passenger&gt;` with the empty string in the `manifest_xml` column, like so:&#xA;&#xA;```sql&#xA;SELECT REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;) FROM travel_manifests LIMIT 1;&#xA;```&#xA;&#xA;Now we can see the string `&lt;passenger&gt;` is gone from the returned result set. We can try getting its length now.&#xA;&#xA;```sql&#xA;SELECT &#xA;    LENGTH(&#xA;        REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;)&#xA;    )&#xA;FROM travel_manifests LIMIT 1;&#xA;```&#xA;&#xA;Now it says `341`. Why? Because we removed (replaced with the empty string) the occurrences of `&lt;passenger&gt;`.&#xA;&#xA;Let&#39;s get the length of the `&#39;&lt;passenger&gt;&#39;` string, which should be `11`, right?&#xA;&#xA;Spell it `p-a-s-s-e-n-g-e-r` as `pass` + `enger` (4+5=9) and 2 for `&lt;&gt;`, so 11. Sometimes I don&#39;t trust my math, so I use SQL.&#xA;&#xA;```sql&#xA;SELECT LENGTH(&#39;&lt;passenger&gt;&#39;);&#xA;```&#xA;&#xA;There it is: `11`.&#xA;&#xA;Now, if we compute the difference between the actual length of the XML and the length with the `&lt;passenger&gt;` parts removed, what do we get?&#xA;&#xA;```sql&#xA;SELECT LENGTH(manifest_xml) - LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;)) FROM travel_manifests LIMIT 1;&#xA;```&#xA;&#xA;We got `33`. Why, you ask? Because `11` times 3 is `33`.
We found three instances of `&lt;passenger&gt;`, so we just need to divide by the length of `&lt;passenger&gt;` (or hard-code it as `11`; either works).&#xA;&#xA;That gives us the number of occurrences of `&lt;passenger&gt;`, which is the number of passengers in the XML string.&#xA;&#xA;```sql&#xA;SELECT&#xA;    (&#xA;        LENGTH(manifest_xml) - LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;))&#xA;    ) / LENGTH(&#39;&lt;passenger&gt;&#39;)&#xA;FROM travel_manifests LIMIT 1;&#xA;```&#xA;&#xA;Phew, it&#39;s `3`!&#xA;&#xA;That was a lot for such simple stuff. But hey, it&#39;s fun!&#xA;&#xA;```&#xA;sqlite&gt; SELECT (manifest_xml) FROM travel_manifests LIMIT 1;&#xA;+--------------------------------------------------------------+&#xA;|                         manifest_xml                         |&#xA;+--------------------------------------------------------------+&#xA;| &lt;manifest&gt;&lt;passengers&gt;&lt;passenger&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;tick |&#xA;| et_class&gt;overnight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;nam |&#xA;| e&gt;Sofia Kim&lt;/name&gt;&lt;ticket_class&gt;overnight&lt;/ticket_class&gt;&lt;eng |&#xA;| ine_check&gt;ignored&lt;/engine_check&gt;&lt;/passenger&gt;&lt;passenger&gt;&lt;name |&#xA;| &gt;Jonah Wolfe&lt;/name&gt;&lt;ticket_class&gt;standard&lt;/ticket_class&gt;&lt;sna |&#xA;| ck_inventory&gt;ignored&lt;/snack_inventory&gt;&lt;/passenger&gt;&lt;/passenge |&#xA;| rs&gt;&lt;/manifest&gt;                                               |&#xA;+--------------------------------------------------------------+&#xA;&#xA;sqlite&gt; SELECT REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;) FROM travel_manifests LIMIT 1;&#xA;+--------------------------------------------------------------+&#xA;|           REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;)           |&#xA;+--------------------------------------------------------------+&#xA;| 
&lt;manifest&gt;&lt;passengers&gt;&lt;name&gt;Nia Grant&lt;/name&gt;&lt;ticket_class&gt;ov |&#xA;| ernight&lt;/ticket_class&gt;&lt;/passenger&gt;&lt;name&gt;Sofia Kim&lt;/name&gt;&lt;tic |&#xA;| ket_class&gt;overnight&lt;/ticket_class&gt;&lt;engine_check&gt;ignored&lt;/eng |&#xA;| ine_check&gt;&lt;/passenger&gt;&lt;name&gt;Jonah Wolfe&lt;/name&gt;&lt;ticket_class&gt; |&#xA;| standard&lt;/ticket_class&gt;&lt;snack_inventory&gt;ignored&lt;/snack_inven |&#xA;| tory&gt;&lt;/passenger&gt;&lt;/passengers&gt;&lt;/manifest&gt;                    |&#xA;+--------------------------------------------------------------+&#xA;sqlite&gt; &#xA;sqlite&gt; SELECT LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;)) FROM travel_manifests LIMIT 1;&#xA;+--------------------------------------------------+&#xA;| LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;)) |&#xA;+--------------------------------------------------+&#xA;| 341                                              |&#xA;+--------------------------------------------------+&#xA;sqlite&gt; SELECT LENGTH(manifest_xml) - LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;)) FROM travel_manifests LIMIT 1;&#xA;+--------------------------------------------------------------+&#xA;| LENGTH(manifest_xml) - LENGTH(REPLACE(manifest_xml, &#39;&lt;passen |&#xA;+--------------------------------------------------------------+&#xA;| 33                                                           |&#xA;+--------------------------------------------------------------+&#xA;sqlite&gt; SELECT LENGTH(&#39;&lt;passenger&gt;&#39;) FROM travel_manifests LIMIT 1;&#xA;+-----------------------+&#xA;| LENGTH(&#39;&lt;passenger&gt;&#39;) |&#xA;+-----------------------+&#xA;| 11                    |&#xA;+-----------------------+&#xA;&#xA;sqlite&gt; SELECT (LENGTH(manifest_xml) - LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;)))/LENGTH(&#39;&lt;passenger&gt;&#39;) FROM travel_manifests 
LIMIT 1;&#xA;+--------------------------------------------------------------+&#xA;| (LENGTH(manifest_xml) - LENGTH(REPLACE(manifest_xml, &#39;&lt;passe |&#xA;+--------------------------------------------------------------+&#xA;| 3                                                            |&#xA;+--------------------------------------------------------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;Now, let&#39;s construct the query to get the number of passengers.&#xA;&#xA;```sql&#xA;SELECT&#xA;    vehicle_id,&#xA;    departure_time,&#xA;    (&#xA;        LENGTH(manifest_xml)&#xA;        - LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;))&#xA;    ) / LENGTH(&#39;&lt;passenger&gt;&#39;) AS passengers_in_manifest&#xA;FROM travel_manifests&#xA;WHERE vehicle_id LIKE &#39;CARGO-%&#39;;&#xA;```&#xA;&#xA;We need to wrap this in a CTE and group by `vehicle_id`, since there are multiple entries per vehicle.&#xA;&#xA;We also need to group records with the same departure time, so that we can combine the passenger counts for that vehicle.&#xA;&#xA;```sql&#xA;WITH passenger_counts AS (&#xA;    SELECT&#xA;        vehicle_id,&#xA;        departure_time,&#xA;        (&#xA;            LENGTH(manifest_xml)&#xA;            - LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;))&#xA;        ) / LENGTH(&#39;&lt;passenger&gt;&#39;) AS passengers_in_manifest&#xA;    FROM travel_manifests&#xA;    WHERE vehicle_id LIKE &#39;CARGO-%&#39;&#xA;)&#xA;SELECT&#xA;    vehicle_id,&#xA;    departure_time,&#xA;    SUM(passengers_in_manifest) AS total_passengers&#xA;FROM passenger_counts&#xA;GROUP BY vehicle_id, departure_time&#xA;HAVING SUM(passengers_in_manifest) &gt; 20&#xA;ORDER BY departure_time;&#xA;```&#xA;&#xA;We count the passengers per manifest in the CTE and filter in the outer query with `SUM(passengers_in_manifest) &gt; 20`, which gives us exactly the condition we need for the result.
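&#xA;&#xA;As a side note, the CTE is mostly for readability; the same logic can be inlined into one query. A sketch of an equivalent version (relying on SQLite accepting the `total_passengers` alias in `HAVING`):&#xA;&#xA;```sql&#xA;SELECT&#xA;    vehicle_id,&#xA;    departure_time,&#xA;    SUM(&#xA;        (&#xA;            LENGTH(manifest_xml)&#xA;            - LENGTH(REPLACE(manifest_xml, &#39;&lt;passenger&gt;&#39;, &#39;&#39;))&#xA;        ) / LENGTH(&#39;&lt;passenger&gt;&#39;)&#xA;    ) AS total_passengers&#xA;FROM travel_manifests&#xA;WHERE vehicle_id LIKE &#39;CARGO-%&#39;&#xA;GROUP BY vehicle_id, departure_time&#xA;HAVING total_passengers &gt; 20&#xA;ORDER BY departure_time;&#xA;```&#xA;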
We have to use `HAVING` because the filter applies after grouping the records by `vehicle_id` and `departure_time`.&#xA;&#xA;We also apply `vehicle_id LIKE &#39;CARGO-%&#39;` inside the CTE to filter right in the inner query, so we don&#39;t compute passenger counts for rows we are going to discard anyway.&#xA;&#xA;That solves this problem.&#xA;&#xA;That&#39;s it from day 13 of Advent of SQL.&#xA;&#xA;There are other ways, but it&#39;s the same parsing. We could use JOINs and such, but hey, that was not the point of this one.&#xA;&#xA;Anyways! See you tomorrow for day 14!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 11: Behavior Score</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-11</link>
      <description>Advent of SQL - Day 11, Behavior Score All right, this is day 11 from Advent of SQL. Let&#39;s pull in the data. No hiccups! Good to go. We just have one table toda</description>
      <pubDate>Fri, 26 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL - Day 11, Behavior Score&#xA;&#xA;All right, this is day 11 from Advent of SQL.&#xA;&#xA;Let&#39;s pull in the data.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS behavior_logs;&#xA;&#xA;CREATE TABLE behavior_logs (&#xA;    id INT PRIMARY KEY,&#xA;    child_id INT,&#xA;    child_name TEXT,&#xA;    behavior_date DATE,&#xA;    score INT&#xA;);&#xA;&#xA;INSERT INTO behavior_logs (id, child_id, child_name, behavior_date, score) VALUES&#xA;    (1, 1, &#39;Emma D.&#39;, &#39;2025-12-01&#39;, 5),&#xA;    (2, 1, &#39;Emma D.&#39;, &#39;2025-12-02&#39;, 1),&#xA;    (3, 1, &#39;Emma D.&#39;, &#39;2025-12-03&#39;, 3),&#xA;    (4, 1, &#39;Emma D.&#39;, &#39;2025-12-04&#39;, 5),&#xA;    (5, 1, &#39;Emma D.&#39;, &#39;2025-12-05&#39;, 2),&#xA;    (6, 1, &#39;Emma D.&#39;, &#39;2025-12-06&#39;, 2),&#xA;    (7, 1, &#39;Emma D.&#39;, &#39;2025-12-07&#39;, 3),&#xA;    (8, 1, &#39;Emma D.&#39;, &#39;2025-12-08&#39;, 5),&#xA;    (9, 1, &#39;Emma D.&#39;, &#39;2025-12-09&#39;, 4),&#xA;    (10, 1, &#39;Emma D.&#39;, &#39;2025-12-10&#39;, 5),&#xA;    (11, 1, &#39;Emma D.&#39;, &#39;2025-12-11&#39;, 5),&#xA;    (12, 1, &#39;Emma D.&#39;, &#39;2025-12-12&#39;, -1),&#xA;    (13, 1, &#39;Emma D.&#39;, &#39;2025-12-13&#39;, 1),&#xA;    (14, 1, &#39;Emma D.&#39;, &#39;2025-12-14&#39;, 1),&#xA;    (15, 1, &#39;Emma D.&#39;, &#39;2025-12-15&#39;, -1),&#xA;    (16, 1, &#39;Emma D.&#39;, &#39;2025-12-16&#39;, 3),&#xA;    (17, 1, &#39;Emma D.&#39;, &#39;2025-12-17&#39;, -2),&#xA;    (18, 1, &#39;Emma D.&#39;, &#39;2025-12-18&#39;, 1),&#xA;    (19, 1, &#39;Emma D.&#39;, &#39;2025-12-19&#39;, 1),&#xA;    (20, 1, &#39;Emma D.&#39;, &#39;2025-12-20&#39;, -2),&#xA;    (21, 2, &#39;Ava X.&#39;, &#39;2025-12-01&#39;, 0),&#xA;    (22, 2, &#39;Ava X.&#39;, &#39;2025-12-02&#39;, -1),&#xA;    (23, 2, &#39;Ava X.&#39;, &#39;2025-12-03&#39;, 4),&#xA;    (24, 2, &#39;Ava X.&#39;, &#39;2025-12-04&#39;, 0),&#xA;    (25, 2, &#39;Ava X.&#39;, &#39;2025-12-05&#39;, 2),&#xA;    (26, 2, 
&#39;Ava X.&#39;, &#39;2025-12-06&#39;, 3),&#xA;    (27, 2, &#39;Ava X.&#39;, &#39;2025-12-07&#39;, 5),&#xA;    (28, 2, &#39;Ava X.&#39;, &#39;2025-12-08&#39;, 2),&#xA;    (29, 2, &#39;Ava X.&#39;, &#39;2025-12-09&#39;, 1),&#xA;    (30, 2, &#39;Ava X.&#39;, &#39;2025-12-10&#39;, 5),&#xA;    (31, 2, &#39;Ava X.&#39;, &#39;2025-12-11&#39;, 2),&#xA;    (32, 2, &#39;Ava X.&#39;, &#39;2025-12-12&#39;, 5),&#xA;    (33, 2, &#39;Ava X.&#39;, &#39;2025-12-13&#39;, 5),&#xA;    (34, 2, &#39;Ava X.&#39;, &#39;2025-12-14&#39;, 2),&#xA;    (35, 2, &#39;Ava X.&#39;, &#39;2025-12-15&#39;, 0),&#xA;    (36, 2, &#39;Ava X.&#39;, &#39;2025-12-16&#39;, 0),&#xA;    (37, 2, &#39;Ava X.&#39;, &#39;2025-12-17&#39;, 5),&#xA;    (38, 2, &#39;Ava X.&#39;, &#39;2025-12-18&#39;, 4),&#xA;    (39, 2, &#39;Ava X.&#39;, &#39;2025-12-19&#39;, 5),&#xA;    (40, 2, &#39;Ava X.&#39;, &#39;2025-12-20&#39;, 5),&#xA;    (181, 10, &#39;Ava C.&#39;, &#39;2025-12-01&#39;, 3),&#xA;    (182, 10, &#39;Ava C.&#39;, &#39;2025-12-02&#39;, 0),&#xA;    (183, 10, &#39;Ava C.&#39;, &#39;2025-12-03&#39;, 3),&#xA;    (184, 10, &#39;Ava C.&#39;, &#39;2025-12-04&#39;, 5),&#xA;    (185, 10, &#39;Ava C.&#39;, &#39;2025-12-05&#39;, 5),&#xA;    (186, 10, &#39;Ava C.&#39;, &#39;2025-12-06&#39;, 4),&#xA;    (187, 10, &#39;Ava C.&#39;, &#39;2025-12-07&#39;, 1),&#xA;    (188, 10, &#39;Ava C.&#39;, &#39;2025-12-08&#39;, 4),&#xA;    (189, 10, &#39;Ava C.&#39;, &#39;2025-12-09&#39;, 5),&#xA;    (190, 10, &#39;Ava C.&#39;, &#39;2025-12-10&#39;, 5),&#xA;    (191, 10, &#39;Ava C.&#39;, &#39;2025-12-11&#39;, 5),&#xA;    (192, 10, &#39;Ava C.&#39;, &#39;2025-12-12&#39;, 0),&#xA;    (193, 10, &#39;Ava C.&#39;, &#39;2025-12-13&#39;, 0),&#xA;    (194, 10, &#39;Ava C.&#39;, &#39;2025-12-14&#39;, 3),&#xA;    (195, 10, &#39;Ava C.&#39;, &#39;2025-12-15&#39;, 1),&#xA;    (196, 10, &#39;Ava C.&#39;, &#39;2025-12-16&#39;, 3),&#xA;    (197, 10, &#39;Ava C.&#39;, &#39;2025-12-17&#39;, -1),&#xA;    (198, 10, &#39;Ava C.&#39;, &#39;2025-12-18&#39;, 
0),&#xA;    (199, 10, &#39;Ava C.&#39;, &#39;2025-12-19&#39;, 5),&#xA;    (200, 10, &#39;Ava C.&#39;, &#39;2025-12-20&#39;, 4),&#xA;    (241, 13, &#39;Ava R.&#39;, &#39;2025-12-01&#39;, 3),&#xA;    (242, 13, &#39;Ava R.&#39;, &#39;2025-12-02&#39;, 2),&#xA;    (243, 13, &#39;Ava R.&#39;, &#39;2025-12-03&#39;, 2),&#xA;    (244, 13, &#39;Ava R.&#39;, &#39;2025-12-04&#39;, 1),&#xA;    (245, 13, &#39;Ava R.&#39;, &#39;2025-12-05&#39;, -1),&#xA;    (246, 13, &#39;Ava R.&#39;, &#39;2025-12-06&#39;, -1),&#xA;    (247, 13, &#39;Ava R.&#39;, &#39;2025-12-07&#39;, 2),&#xA;    (248, 13, &#39;Ava R.&#39;, &#39;2025-12-08&#39;, 5),&#xA;    (249, 13, &#39;Ava R.&#39;, &#39;2025-12-09&#39;, 0),&#xA;    (250, 13, &#39;Ava R.&#39;, &#39;2025-12-10&#39;, 5),&#xA;    (251, 13, &#39;Ava R.&#39;, &#39;2025-12-11&#39;, 2),&#xA;    (252, 13, &#39;Ava R.&#39;, &#39;2025-12-12&#39;, -1),&#xA;    (253, 13, &#39;Ava R.&#39;, &#39;2025-12-13&#39;, 2),&#xA;    (254, 13, &#39;Ava R.&#39;, &#39;2025-12-14&#39;, 3),&#xA;    (255, 13, &#39;Ava R.&#39;, &#39;2025-12-15&#39;, 2),&#xA;    (256, 13, &#39;Ava R.&#39;, &#39;2025-12-16&#39;, -1),&#xA;    (257, 13, &#39;Ava R.&#39;, &#39;2025-12-17&#39;, -2),&#xA;    (258, 13, &#39;Ava R.&#39;, &#39;2025-12-18&#39;, -4),&#xA;    (259, 13, &#39;Ava R.&#39;, &#39;2025-12-19&#39;, -3),&#xA;    (260, 13, &#39;Ava R.&#39;, &#39;2025-12-20&#39;, 2),&#xA;    (1961, 99, &#39;Ava X.&#39;, &#39;2025-12-01&#39;, 2),&#xA;    (1962, 99, &#39;Ava X.&#39;, &#39;2025-12-02&#39;, -2),&#xA;    (1963, 99, &#39;Ava X.&#39;, &#39;2025-12-03&#39;, -1),&#xA;    (1964, 99, &#39;Ava X.&#39;, &#39;2025-12-04&#39;, -2),&#xA;    (1965, 99, &#39;Ava X.&#39;, &#39;2025-12-05&#39;, 3),&#xA;    (1966, 99, &#39;Ava X.&#39;, &#39;2025-12-06&#39;, -1),&#xA;    (1967, 99, &#39;Ava X.&#39;, &#39;2025-12-07&#39;, 0),&#xA;    (1968, 99, &#39;Ava X.&#39;, &#39;2025-12-08&#39;, 1),&#xA;    (1969, 99, &#39;Ava X.&#39;, &#39;2025-12-09&#39;, 0),&#xA;    (1970, 99, &#39;Ava X.&#39;, 
&#39;2025-12-10&#39;, 0),&#xA;    (1971, 99, &#39;Ava X.&#39;, &#39;2025-12-11&#39;, 3),&#xA;    (1972, 99, &#39;Ava X.&#39;, &#39;2025-12-12&#39;, 4),&#xA;    (1973, 99, &#39;Ava X.&#39;, &#39;2025-12-13&#39;, 4),&#xA;    (1974, 99, &#39;Ava X.&#39;, &#39;2025-12-14&#39;, 0),&#xA;    (1975, 99, &#39;Ava X.&#39;, &#39;2025-12-15&#39;, 3),&#xA;    (1976, 99, &#39;Ava X.&#39;, &#39;2025-12-16&#39;, -1),&#xA;    (1977, 99, &#39;Ava X.&#39;, &#39;2025-12-17&#39;, -1),&#xA;    (1978, 99, &#39;Ava X.&#39;, &#39;2025-12-18&#39;, 3),&#xA;    (1979, 99, &#39;Ava X.&#39;, &#39;2025-12-19&#39;, 3),&#xA;    (1980, 99, &#39;Ava X.&#39;, &#39;2025-12-20&#39;, -3);&#xA;```&#xA;&#xA;No hiccups! Good to go.&#xA;&#xA;We just have one table today.&#xA;&#xA;```sql&#xA; SELECT * FROM behavior_logs;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; .read day11-inserts.sql&#xA;sqlite&gt; .mode table&#xA;sqlite&gt; .schema&#xA;CREATE TABLE behavior_logs (&#xA;    id INT PRIMARY KEY,&#xA;    child_id INT,&#xA;    child_name TEXT,&#xA;    behavior_date DATE,&#xA;    score INT&#xA;);&#xA;sqlite&gt; SELECT * FROM behavior_logs LIMIT 20;&#xA;+----+----------+------------+---------------+-------+&#xA;| id | child_id | child_name | behavior_date | score |&#xA;+----+----------+------------+---------------+-------+&#xA;| 1  | 1        | Emma D.    | 2025-12-01    | 5     |&#xA;| 2  | 1        | Emma D.    | 2025-12-02    | 1     |&#xA;| 3  | 1        | Emma D.    | 2025-12-03    | 3     |&#xA;| 4  | 1        | Emma D.    | 2025-12-04    | 5     |&#xA;| 5  | 1        | Emma D.    | 2025-12-05    | 2     |&#xA;| 6  | 1        | Emma D.    | 2025-12-06    | 2     |&#xA;| 7  | 1        | Emma D.    | 2025-12-07    | 3     |&#xA;| 8  | 1        | Emma D.    | 2025-12-08    | 5     |&#xA;| 9  | 1        | Emma D.    | 2025-12-09    | 4     |&#xA;| 10 | 1        | Emma D.    | 2025-12-10    | 5     |&#xA;| 11 | 1        | Emma D.    | 2025-12-11    | 5     |&#xA;| 12 | 1        | Emma D.    
| 2025-12-12    | -1    |&#xA;| 13 | 1        | Emma D.    | 2025-12-13    | 1     |&#xA;| 14 | 1        | Emma D.    | 2025-12-14    | 1     |&#xA;| 15 | 1        | Emma D.    | 2025-12-15    | -1    |&#xA;| 16 | 1        | Emma D.    | 2025-12-16    | 3     |&#xA;| 17 | 1        | Emma D.    | 2025-12-17    | -2    |&#xA;| 18 | 1        | Emma D.    | 2025-12-18    | 1     |&#xA;| 19 | 1        | Emma D.    | 2025-12-19    | 1     |&#xA;| 20 | 1        | Emma D.    | 2025-12-20    | -2    |&#xA;+----+----------+------------+---------------+-------+&#xA;sqlite&gt; SELECT * FROM behavior_logs WHERE child_name LIKE &#39;Ava&#39;;&#xA;sqlite&gt; SELECT * FROM behavior_logs WHERE child_name LIKE &#39;Ava%&#39;;&#xA;+------+----------+------------+---------------+-------+&#xA;|  id  | child_id | child_name | behavior_date | score |&#xA;+------+----------+------------+---------------+-------+&#xA;| 21   | 2        | Ava X.     | 2025-12-01    | 0     |&#xA;| 22   | 2        | Ava X.     | 2025-12-02    | -1    |&#xA;| 23   | 2        | Ava X.     | 2025-12-03    | 4     |&#xA;| 24   | 2        | Ava X.     | 2025-12-04    | 0     |&#xA;| 25   | 2        | Ava X.     | 2025-12-05    | 2     |&#xA;| 26   | 2        | Ava X.     | 2025-12-06    | 3     |&#xA;| 27   | 2        | Ava X.     | 2025-12-07    | 5     |&#xA;| 28   | 2        | Ava X.     | 2025-12-08    | 2     |&#xA;| 29   | 2        | Ava X.     | 2025-12-09    | 1     |&#xA;| 30   | 2        | Ava X.     | 2025-12-10    | 5     |&#xA;| 31   | 2        | Ava X.     | 2025-12-11    | 2     |&#xA;| 32   | 2        | Ava X.     | 2025-12-12    | 5     |&#xA;| 33   | 2        | Ava X.     | 2025-12-13    | 5     |&#xA;| 34   | 2        | Ava X.     | 2025-12-14    | 2     |&#xA;| 35   | 2        | Ava X.     | 2025-12-15    | 0     |&#xA;| 36   | 2        | Ava X.     | 2025-12-16    | 0     |&#xA;| 37   | 2        | Ava X.     | 2025-12-17    | 5     |&#xA;| 38   | 2        | Ava X.     
| 2025-12-18    | 4     |&#xA;| 39   | 2        | Ava X.     | 2025-12-19    | 5     |&#xA;| 40   | 2        | Ava X.     | 2025-12-20    | 5     |&#xA;| 181  | 10       | Ava C.     | 2025-12-01    | 3     |&#xA;| 182  | 10       | Ava C.     | 2025-12-02    | 0     |&#xA;| 183  | 10       | Ava C.     | 2025-12-03    | 3     |&#xA;| 184  | 10       | Ava C.     | 2025-12-04    | 5     |&#xA;| 185  | 10       | Ava C.     | 2025-12-05    | 5     |&#xA;| 186  | 10       | Ava C.     | 2025-12-06    | 4     |&#xA;| 187  | 10       | Ava C.     | 2025-12-07    | 1     |&#xA;| 188  | 10       | Ava C.     | 2025-12-08    | 4     |&#xA;| 189  | 10       | Ava C.     | 2025-12-09    | 5     |&#xA;| 190  | 10       | Ava C.     | 2025-12-10    | 5     |&#xA;| 191  | 10       | Ava C.     | 2025-12-11    | 5     |&#xA;| 192  | 10       | Ava C.     | 2025-12-12    | 0     |&#xA;| 193  | 10       | Ava C.     | 2025-12-13    | 0     |&#xA;| 194  | 10       | Ava C.     | 2025-12-14    | 3     |&#xA;| 195  | 10       | Ava C.     | 2025-12-15    | 1     |&#xA;| 196  | 10       | Ava C.     | 2025-12-16    | 3     |&#xA;| 197  | 10       | Ava C.     | 2025-12-17    | -1    |&#xA;| 198  | 10       | Ava C.     | 2025-12-18    | 0     |&#xA;| 199  | 10       | Ava C.     | 2025-12-19    | 5     |&#xA;| 200  | 10       | Ava C.     | 2025-12-20    | 4     |&#xA;| 241  | 13       | Ava R.     | 2025-12-01    | 3     |&#xA;| 242  | 13       | Ava R.     | 2025-12-02    | 2     |&#xA;| 243  | 13       | Ava R.     | 2025-12-03    | 2     |&#xA;| 244  | 13       | Ava R.     | 2025-12-04    | 1     |&#xA;| 245  | 13       | Ava R.     | 2025-12-05    | -1    |&#xA;| 246  | 13       | Ava R.     | 2025-12-06    | -1    |&#xA;| 247  | 13       | Ava R.     | 2025-12-07    | 2     |&#xA;| 248  | 13       | Ava R.     | 2025-12-08    | 5     |&#xA;| 249  | 13       | Ava R.     | 2025-12-09    | 0     |&#xA;| 250  | 13       | Ava R.     
| 2025-12-10    | 5     |&#xA;| 251  | 13       | Ava R.     | 2025-12-11    | 2     |&#xA;| 252  | 13       | Ava R.     | 2025-12-12    | -1    |&#xA;| 253  | 13       | Ava R.     | 2025-12-13    | 2     |&#xA;| 254  | 13       | Ava R.     | 2025-12-14    | 3     |&#xA;| 255  | 13       | Ava R.     | 2025-12-15    | 2     |&#xA;| 256  | 13       | Ava R.     | 2025-12-16    | -1    |&#xA;| 257  | 13       | Ava R.     | 2025-12-17    | -2    |&#xA;| 258  | 13       | Ava R.     | 2025-12-18    | -4    |&#xA;| 259  | 13       | Ava R.     | 2025-12-19    | -3    |&#xA;| 260  | 13       | Ava R.     | 2025-12-20    | 2     |&#xA;| 1961 | 99       | Ava X.     | 2025-12-01    | 2     |&#xA;| 1962 | 99       | Ava X.     | 2025-12-02    | -2    |&#xA;| 1963 | 99       | Ava X.     | 2025-12-03    | -1    |&#xA;| 1964 | 99       | Ava X.     | 2025-12-04    | -2    |&#xA;| 1965 | 99       | Ava X.     | 2025-12-05    | 3     |&#xA;| 1966 | 99       | Ava X.     | 2025-12-06    | -1    |&#xA;| 1967 | 99       | Ava X.     | 2025-12-07    | 0     |&#xA;| 1968 | 99       | Ava X.     | 2025-12-08    | 1     |&#xA;| 1969 | 99       | Ava X.     | 2025-12-09    | 0     |&#xA;| 1970 | 99       | Ava X.     | 2025-12-10    | 0     |&#xA;| 1971 | 99       | Ava X.     | 2025-12-11    | 3     |&#xA;| 1972 | 99       | Ava X.     | 2025-12-12    | 4     |&#xA;| 1973 | 99       | Ava X.     | 2025-12-13    | 4     |&#xA;| 1974 | 99       | Ava X.     | 2025-12-14    | 0     |&#xA;| 1975 | 99       | Ava X.     | 2025-12-15    | 3     |&#xA;| 1976 | 99       | Ava X.     | 2025-12-16    | -1    |&#xA;| 1977 | 99       | Ava X.     | 2025-12-17    | -1    |&#xA;| 1978 | 99       | Ava X.     | 2025-12-18    | 3     |&#xA;| 1979 | 99       | Ava X.     | 2025-12-19    | 3     |&#xA;| 1980 | 99       | Ava X.     
| 2025-12-20    | -3    |&#xA;+------+----------+------------+---------------+-------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Let&#39;s get to the problem of day 11.&#xA;&#xA;## Problem&#xA;&#xA;&gt; Calculate the 7-day rolling average behavior score for each child. Identify any child whose rolling average drops below 0. For those children with a rolling average below 0, return the `child_id`, `child_name`, `behavior_date` (this will be the latest date in the 7-day rolling average), and the calculated 7-day rolling average. Only include results with a `behavior_date` of `December 7, 2025` or later, ensuring that each rolling average is based on a full 7 days of data.&#xA;&gt; &#xA;&gt; Order the results by `behavior_date` and then `child_name`.&#xA;&#xA;So, we need to do what?&#xA;&#xA;- Group by child_id&#xA;- Compute the 7-day rolling average (for each date, the average over that day and the 6 days before it; e.g. the value for 20th December averages 14th to 20th December)&#xA;- Order by behavior_date and child_name.&#xA;&#xA;### Using Simple Join&#xA;&#xA;We first have to compute the rolling average, which covers only the past 7 days per child.&#xA;&#xA;To do that, we can self-join the `behavior_logs` table on the condition that the behavior date (from the right table) falls between the current behavior date (from the left table) and 6 days before it, for each child.&#xA;&#xA;```sql&#xA;SELECT &#xA;    behavior_logs.child_id,&#xA;    behavior_logs.child_name,&#xA;    behavior_logs.behavior_date,&#xA;    AVG(week_logs.score) AS rolling_avg&#xA;FROM &#xA;    behavior_logs&#xA;JOIN &#xA;    behavior_logs week_logs&#xA;    ON behavior_logs.child_id = week_logs.child_id&#xA;    AND week_logs.behavior_date BETWEEN DATE(behavior_logs.behavior_date, &#39;-6 days&#39;) AND behavior_logs.behavior_date&#xA;GROUP BY &#xA;    behavior_logs.child_id;&#xA;```&#xA;&#xA;We basically joined the table `behavior_logs` with itself, i.e. the right and left tables are the same.
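&#xA;&#xA;The `DATE(..., &#39;-6 days&#39;)` in that join condition is SQLite&#39;s date modifier syntax; a quick sanity check of what it returns:&#xA;&#xA;```sql&#xA;SELECT DATE(&#39;2025-12-07&#39;, &#39;-6 days&#39;);&#xA;-- 2025-12-01&#xA;```&#xA;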
So, we join on the condition of the same child_id, then look for dates between 6 days ago and the current behavior date (the current date itself counts as the 7th day, hence we subtract only 6 days).&#xA;&#xA;We group by `child_id` and compute `AVG(score)` to get the average score over that 7-day range.&#xA;&#xA;You see the issue: grouping only by `child_id` collapses everything into a single row per child, but we need a rolling average for every date, not just one value. We also need to group by the behavior date, so each window stays unique and a child can have multiple entries.&#xA;&#xA;```sql&#xA;SELECT &#xA;    behavior_logs.child_id,&#xA;    behavior_logs.child_name,&#xA;    behavior_logs.behavior_date,&#xA;    AVG(week_logs.score) AS rolling_avg&#xA;FROM &#xA;    behavior_logs&#xA;JOIN &#xA;    behavior_logs week_logs&#xA;    ON behavior_logs.child_id = week_logs.child_id&#xA;    AND week_logs.behavior_date BETWEEN DATE(behavior_logs.behavior_date, &#39;-6 days&#39;) AND behavior_logs.behavior_date&#xA;GROUP BY &#xA;    behavior_logs.child_id, behavior_logs.behavior_date;&#xA;```&#xA;&#xA;Now that we have also grouped by `behavior_date`, each child gets one row per date that we have logs for. Now we can filter it further.&#xA;&#xA;We also need to filter where the `rolling_avg` is less than 0, i.e. the child had a bad week overall. That belongs in a `HAVING` condition, not a `WHERE` condition: since `AVG` is an aggregate function, we can&#39;t reference `rolling_avg` in the `WHERE` clause; it won&#39;t be defined there yet.
So, we use `HAVING` to filter `rolling_avg` to less than `0`.&#xA;&#xA;```sql&#xA;SELECT &#xA;    behavior_logs.child_id,&#xA;    behavior_logs.child_name,&#xA;    behavior_logs.behavior_date,&#xA;    AVG(week_logs.score) AS rolling_avg&#xA;FROM &#xA;    behavior_logs&#xA;JOIN &#xA;    behavior_logs week_logs&#xA;    ON behavior_logs.child_id = week_logs.child_id&#xA;    AND week_logs.behavior_date BETWEEN DATE(behavior_logs.behavior_date, &#39;-6 days&#39;) AND behavior_logs.behavior_date&#xA;GROUP BY &#xA;    behavior_logs.child_id, behavior_logs.behavior_date&#xA;HAVING &#xA;    rolling_avg &lt; 0;&#xA;```&#xA;&#xA;We are now down to only the rows where a child had a `bad` window.&#xA;&#xA;There is one catch, however: we can&#39;t use windows ending before `7th December`, because there isn&#39;t enough data before that to compute a full 7-day rolling average. Hence we only include records with a behavior date of `7th December` or later.&#xA;&#xA;We add this to the `HAVING` clause as well, since we are deciding on the window&#39;s final behavior date rather than the individual logs (as `behavior_date` is a grouping column, a `WHERE` on the outer table would also work here).&#xA;&#xA;```sql&#xA;SELECT &#xA;    behavior_logs.child_id,&#xA;    behavior_logs.child_name,&#xA;    behavior_logs.behavior_date,&#xA;    AVG(week_logs.score) AS rolling_avg&#xA;FROM &#xA;    behavior_logs&#xA;JOIN &#xA;    behavior_logs week_logs&#xA;    ON behavior_logs.child_id = week_logs.child_id&#xA;    AND week_logs.behavior_date BETWEEN DATE(behavior_logs.behavior_date, &#39;-6 days&#39;) AND behavior_logs.behavior_date&#xA;GROUP BY &#xA;    behavior_logs.child_id, behavior_logs.behavior_date&#xA;HAVING &#xA;    rolling_avg &lt; 0&#xA;    AND behavior_logs.behavior_date &gt;= &#39;2025-12-07&#39;;&#xA;```&#xA;&#xA;Now, the final piece is the order.&#xA;&#xA;We need to order by `behavior_date` and then `child_name`, as requested.&#xA;&#xA;```sql&#xA;SELECT &#xA;    behavior_logs.child_id,&#xA;    behavior_logs.child_name,&#xA;    
behavior_logs.behavior_date,&#xA;    AVG(week_logs.score) AS rolling_avg&#xA;FROM &#xA;    behavior_logs&#xA;JOIN &#xA;    behavior_logs week_logs&#xA;    ON behavior_logs.child_id = week_logs.child_id&#xA;    AND week_logs.behavior_date BETWEEN DATE(behavior_logs.behavior_date, &#39;-6 days&#39;) AND behavior_logs.behavior_date&#xA;GROUP BY &#xA;    behavior_logs.child_id, behavior_logs.behavior_date&#xA;HAVING &#xA;    rolling_avg &lt; 0&#xA;    AND behavior_logs.behavior_date &gt;= &#39;2025-12-07&#39;&#xA;ORDER BY &#xA;    behavior_logs.behavior_date, behavior_logs.child_name;&#xA;```&#xA;&#xA;This is a simple solution: easy to understand and explain, though not exactly short and crisp.&#xA;&#xA;We can even do this with a sub-query, without a JOIN.&#xA;&#xA;### Using Sub-query&#xA;&#xA;We take the conditions from the `JOIN` and move them into a correlated sub-query, like so.&#xA;&#xA;```sql&#xA;SELECT&#xA;    current_logs.child_id,&#xA;    current_logs.child_name,&#xA;    current_logs.behavior_date,&#xA;    (&#xA;        SELECT AVG(past_logs.score)&#xA;        FROM behavior_logs past_logs&#xA;        WHERE past_logs.child_id = current_logs.child_id&#xA;          AND past_logs.behavior_date&#xA;              BETWEEN date(current_logs.behavior_date, &#39;-6 days&#39;)&#xA;                  AND current_logs.behavior_date&#xA;    ) AS rolling_avg&#xA;FROM behavior_logs current_logs&#xA;WHERE current_logs.behavior_date &gt;= &#39;2025-12-07&#39;&#xA;  AND rolling_avg &lt; 0&#xA;ORDER BY current_logs.behavior_date, current_logs.child_name;&#xA;```&#xA;&#xA;Since it&#39;s a sub-query, we don&#39;t need a GROUP BY or a HAVING clause; the rolling average and the behavior date are filtered directly in the `WHERE` clause. (Referencing the `rolling_avg` alias in `WHERE` is something SQLite accepts; many other databases would reject it.)&#xA;&#xA;### Using Sub-query and Window Function&#xA;&#xA;We can take the above query and, instead of the join, write a sub-query that computes the rolling average using a window function.&#xA;&#xA;```sql&#xA;SELECT&#xA;    child_id,&#xA;    child_name,&#xA;    behavior_date,&#xA;    
AVG(score) OVER (&#xA;        PARTITION BY child_id&#xA;        ORDER BY behavior_date&#xA;        ROWS BETWEEN 6 PRECEDING AND CURRENT ROW&#xA;    ) AS rolling_avg&#xA;FROM behavior_logs;&#xA;```&#xA;&#xA;We use `AVG(score) OVER (...)`, a window function. We partition (create a window) per child, order by the behavior date, and slide over the past 7 days.&#xA;&#xA;The `ROWS BETWEEN 6 PRECEDING AND CURRENT ROW` frame defines a 7-row sliding window. Every row gets its own window, and each window is separate per child because of `PARTITION BY child_id`.&#xA;&#xA;So we get a full per-day rolling average for each child with this query.&#xA;&#xA;This had around 2400 rows.&#xA;&#xA;Now we need to filter it down to only the rows where the rolling average is less than `0`, and we exclude days before `7th December`, since there aren&#39;t enough days before that to compute a full 7-day rolling average.&#xA;&#xA;But window function results can&#39;t be filtered in the same query: we can&#39;t reference `rolling_avg` in the `WHERE` clause, as it is not available there.
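&#xA;&#xA;A quick aside on the frame itself: `ROWS BETWEEN 6 PRECEDING AND CURRENT ROW` counts rows, not calendar days, so a missing date would silently stretch the window beyond 7 days. In this dataset each child appears to have a log for every day, so rows are fine; a date-based sketch (assuming SQLite 3.28+ for `RANGE` frames with offsets) would order on a numeric day value instead:&#xA;&#xA;```sql&#xA;SELECT&#xA;    child_id,&#xA;    child_name,&#xA;    behavior_date,&#xA;    AVG(score) OVER (&#xA;        PARTITION BY child_id&#xA;        ORDER BY julianday(behavior_date)&#xA;        RANGE BETWEEN 6 PRECEDING AND CURRENT ROW&#xA;    ) AS rolling_avg&#xA;FROM behavior_logs;&#xA;```&#xA;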
We haven&#39;t grouped by anything explicitly either, so `HAVING` won&#39;t help; we need to wrap it in a sub-query and apply both the rolling average and the behavior date conditions in the outer query.&#xA;&#xA;```sql&#xA;SELECT&#xA;    child_id,&#xA;    child_name,&#xA;    behavior_date,&#xA;    rolling_avg&#xA;FROM (&#xA;    SELECT&#xA;        child_id,&#xA;        child_name,&#xA;        behavior_date,&#xA;        AVG(score) OVER (&#xA;            PARTITION BY child_id&#xA;            ORDER BY behavior_date&#xA;            ROWS BETWEEN 6 PRECEDING AND CURRENT ROW&#xA;        ) AS rolling_avg&#xA;    FROM behavior_logs&#xA;)&#xA;WHERE behavior_date &gt;= &#39;2025-12-07&#39;&#xA;  AND rolling_avg &lt; 0;&#xA;```&#xA;&#xA;This filters down the rows.&#xA;&#xA;Now, we also need to order by `behavior_date` and then `child_name`.&#xA;&#xA;```sql&#xA;SELECT&#xA;    child_id,&#xA;    child_name,&#xA;    behavior_date,&#xA;    rolling_avg&#xA;FROM (&#xA;    SELECT&#xA;        child_id,&#xA;        child_name,&#xA;        behavior_date,&#xA;        AVG(score) OVER (&#xA;            PARTITION BY child_id&#xA;            ORDER BY behavior_date&#xA;            ROWS BETWEEN 6 PRECEDING AND CURRENT ROW&#xA;        ) AS rolling_avg&#xA;    FROM behavior_logs&#xA;)&#xA;WHERE behavior_date &gt;= &#39;2025-12-07&#39;&#xA;  AND rolling_avg &lt; 0&#xA;ORDER BY behavior_date, child_name;&#xA;```&#xA;&#xA;So, that is again the same stuff, just with a window function.&#xA;&#xA;We can even do this with that sub-query wrapped in a CTE (it just looks and reads better; nothing really changes, other than that a CTE can be referenced multiple times in the same query, which we don&#39;t need here).&#xA;&#xA;### Using CTE and Window Function&#xA;&#xA;We just take the above sub-query and wrap it in a CTE.&#xA;&#xA;```sql&#xA;WITH rolling AS (&#xA;    SELECT&#xA;        child_id,&#xA;        child_name,&#xA;        behavior_date,&#xA;        AVG(score) OVER (&#xA;            PARTITION BY child_id&#xA;            ORDER BY 
behavior_date&#xA;            ROWS BETWEEN 6 PRECEDING AND CURRENT ROW&#xA;        ) AS rolling_avg&#xA;    FROM behavior_logs&#xA;)&#xA;SELECT&#xA;    child_id,&#xA;    child_name,&#xA;    behavior_date,&#xA;    rolling_avg&#xA;FROM rolling&#xA;WHERE behavior_date &gt;= &#39;2025-12-07&#39;&#xA;  AND rolling_avg &lt; 0&#xA;ORDER BY behavior_date, child_name;&#xA;&#xA;```&#xA;&#xA;The rest remains the same; we just reference the `rolling` CTE and grab the necessary details from it in the final query.&#xA;&#xA;That should be it for Day 11!&#xA;&#xA;Some cool CTEs, window functions and a practical use case for computing rolling averages, loved it!&#xA;&#xA;On to day 12!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 10: Misdelivered Presents</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-10</link>
      <description>Advent of SQL, Day 10 - Misdelivered Presents It&#39;s already day 10? We just need 5 more days now! Whoa! that flew by swiftly. Let&#39;s pull in the data. This is the</description>
      <pubDate>Thu, 25 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL, Day 10 - Misdelivered Presents&#xA;&#xA;It&#39;s already day 10? We just need 5 more days now! Whoa! that flew by swiftly.&#xA;&#xA;Let&#39;s pull in the data.&#xA;&#xA;This is the SQL for day 10 in SQLite.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS misdelivered_presents;&#xA;DROP TABLE IF EXISTS deliveries;&#xA;&#xA;CREATE TABLE deliveries (&#xA;    id INT PRIMARY KEY,&#xA;    child_name TEXT,&#xA;    delivery_location TEXT,&#xA;    gift_name TEXT,&#xA;    scheduled_at TIMESTAMP&#xA;);&#xA;&#xA;CREATE TABLE misdelivered_presents (&#xA;    id INT PRIMARY KEY,&#xA;    child_name TEXT,&#xA;    delivery_location TEXT,&#xA;    gift_name TEXT,&#xA;    scheduled_at TIMESTAMP,&#xA;    flagged_at TIMESTAMP,&#xA;    reason TEXT&#xA;);&#xA;&#xA;INSERT INTO deliveries (id, child_name, delivery_location, gift_name, scheduled_at) VALUES&#xA;    (1, &#39;Omar Q.&#39;, &#39;45 Maple Street&#39;, &#39;storybook collection&#39;, &#39;2025-12-24 21:09:00&#39;),&#xA;    (2, &#39;Sofia K.&#39;, &#39;77 Snowflake Road&#39;, &#39;plush reindeer&#39;, &#39;2025-12-24 18:35:00&#39;),&#xA;    (3, &#39;Mila N.&#39;, &#39;The Vibes&#39;, &#39;storybook collection&#39;, &#39;2025-12-24 21:09:00&#39;),&#xA;    (4, &#39;Elias M.&#39;, &#39;Frost Hollow Cabin&#39;, &#39;board game&#39;, &#39;2025-12-24 20:31:00&#39;),&#xA;    (5, &#39;Ravi P.&#39;, &#39;45 Maple Street&#39;, &#39;wooden train set&#39;, &#39;2025-12-24 18:23:00&#39;),&#xA;    (6, &#39;Jonah W.&#39;, &#39;77 Snowflake Road&#39;, &#39;plush reindeer&#39;, &#39;2025-12-24 20:34:00&#39;),&#xA;    (7, &#39;Ava J.&#39;, &#39;123 Evergreen Lane&#39;, &#39;board game&#39;, &#39;2025-12-24 21:03:00&#39;),&#xA;    (8, &#39;Omar Q.&#39;, &#39;77 Snowflake Road&#39;, &#39;board game&#39;, &#39;2025-12-24 18:56:00&#39;),&#xA;    (9, &#39;Nia G.&#39;, &#39;Frost Hollow Cabin&#39;, &#39;teddy bear&#39;, &#39;2025-12-24 21:27:00&#39;),&#xA;    (10, &#39;Zara S.&#39;, &#39;North Pole Annex&#39;, &#39;wooden train 
set&#39;, &#39;2025-12-24 20:58:00&#39;),&#xA;    (11, &#39;Ravi P.&#39;, &#39;Frost Hollow Cabin&#39;, &#39;puzzle box&#39;, &#39;2025-12-24 18:39:00&#39;),&#xA;    (12, &#39;Jonah W.&#39;, &#39;123 Evergreen Lane&#39;, &#39;puzzle box&#39;, &#39;2025-12-24 18:23:00&#39;),&#xA;    (13, &#39;Ravi P.&#39;, &#39;North Pole Annex&#39;, &#39;storybook collection&#39;, &#39;2025-12-24 21:36:00&#39;),&#xA;    (14, &#39;Lena F.&#39;, &#39;North Pole Annex&#39;, &#39;teddy bear&#39;, &#39;2025-12-24 21:26:00&#39;),&#xA;    (15, &#39;Ava J.&#39;, &#39;North Pole Annex&#39;, &#39;snow globe&#39;, &#39;2025-12-24 18:31:00&#39;),&#xA;    (16, &#39;Elias M.&#39;, &#39;Frost Hollow Cabin&#39;, &#39;robot toy&#39;, &#39;2025-12-24 20:21:00&#39;),&#xA;    (17, &#39;Sofia K.&#39;, &#39;Frost Hollow Cabin&#39;, &#39;teddy bear&#39;, &#39;2025-12-24 20:27:00&#39;),&#xA;    (18, &#39;Jonah W.&#39;, &#39;77 Snowflake Road&#39;, &#39;storybook collection&#39;, &#39;2025-12-24 20:49:00&#39;),&#xA;    (19, &#39;Jonah W.&#39;, &#39;Frost Hollow Cabin&#39;, &#39;art supplies&#39;, &#39;2025-12-24 21:38:00&#39;),&#xA;    (20, &#39;Jonah W.&#39;, &#39;123 Evergreen Lane&#39;, &#39;storybook collection&#39;, &#39;2025-12-24 19:11:00&#39;);&#xA;&#xA;INSERT INTO misdelivered_presents&#xA;(id, child_name, delivery_location, gift_name, scheduled_at, flagged_at, reason)&#xA;VALUES&#xA;    (601, &#39;Priya D.&#39;, &#39;The Vibes&#39;, &#39;plush reindeer&#39;, &#39;2025-12-24 14:00:00&#39;, &#39;2025-12-24 14:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (602, &#39;Lena F.&#39;, &#39;Abandoned Lighthouse&#39;, &#39;board game&#39;, &#39;2025-12-22 06:00:00&#39;, &#39;2025-12-22 06:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (603, &#39;Caleb O.&#39;, &#39;Drifting Igloo&#39;, &#39;board game&#39;, &#39;2025-12-24 06:00:00&#39;, &#39;2025-12-24 06:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (604, &#39;Mateo C.&#39;, &#39;The Vibes&#39;, &#39;art supplies&#39;, 
&#39;2025-12-22 04:00:00&#39;, &#39;2025-12-22 04:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (605, &#39;Hiro T.&#39;, &#39;The Vibes&#39;, &#39;robot toy&#39;, &#39;2025-12-24 08:00:00&#39;, &#39;2025-12-24 08:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (606, &#39;Priya D.&#39;, &#39;Volcano Rim&#39;, &#39;puzzle box&#39;, &#39;2025-12-22 08:00:00&#39;, &#39;2025-12-22 08:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (607, &#39;Nia G.&#39;, &#39;Abandoned Lighthouse&#39;, &#39;board game&#39;, &#39;2025-12-24 01:00:00&#39;, &#39;2025-12-24 01:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (608, &#39;Elias M.&#39;, &#39;Drifting Igloo&#39;, &#39;board game&#39;, &#39;2025-12-24 01:00:00&#39;, &#39;2025-12-24 01:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (609, &#39;Ravi P.&#39;, &#39;Volcano Rim&#39;, &#39;board game&#39;, &#39;2025-12-24 02:00:00&#39;, &#39;2025-12-24 02:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (610, &#39;Hiro T.&#39;, &#39;Abandoned Lighthouse&#39;, &#39;science kit&#39;, &#39;2025-12-23 20:00:00&#39;, &#39;2025-12-23 20:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (611, &#39;Priya D.&#39;, &#39;Drifting Igloo&#39;, &#39;puzzle box&#39;, &#39;2025-12-22 21:00:00&#39;, &#39;2025-12-22 21:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (612, &#39;Hiro T.&#39;, &#39;Volcano Rim&#39;, &#39;art supplies&#39;, &#39;2025-12-23 09:00:00&#39;, &#39;2025-12-23 09:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (613, &#39;Jonah W.&#39;, &#39;Abandoned Lighthouse&#39;, &#39;board game&#39;, &#39;2025-12-24 01:00:00&#39;, &#39;2025-12-24 01:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (614, &#39;Omar Q.&#39;, &#39;Volcano Rim&#39;, &#39;art supplies&#39;, &#39;2025-12-22 01:00:00&#39;, &#39;2025-12-22 01:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (615, &#39;Omar Q.&#39;, &#39;Drifting Igloo&#39;, &#39;science kit&#39;, 
&#39;2025-12-23 20:00:00&#39;, &#39;2025-12-23 20:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (616, &#39;Omar Q.&#39;, &#39;Abandoned Lighthouse&#39;, &#39;teddy bear&#39;, &#39;2025-12-24 12:00:00&#39;, &#39;2025-12-24 12:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (617, &#39;Zara S.&#39;, &#39;Volcano Rim&#39;, &#39;wooden train set&#39;, &#39;2025-12-24 12:00:00&#39;, &#39;2025-12-24 12:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (618, &#39;Omar Q.&#39;, &#39;Abandoned Lighthouse&#39;, &#39;teddy bear&#39;, &#39;2025-12-23 15:00:00&#39;, &#39;2025-12-23 15:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (619, &#39;Caleb O.&#39;, &#39;The Vibes&#39;, &#39;teddy bear&#39;, &#39;2025-12-24 14:00:00&#39;, &#39;2025-12-24 14:05:00&#39;, &#39;Invalid delivery location&#39;),&#xA;    (620, &#39;Nia G.&#39;, &#39;Abandoned Lighthouse&#39;, &#39;board game&#39;, &#39;2025-12-23 03:00:00&#39;, &#39;2025-12-23 03:05:00&#39;, &#39;Invalid delivery location&#39;);&#xA;```&#xA;&#xA;```sql&#xA;SELECT * FROM deliveries;&#xA;SELECT * FROM misdelivered_presents;&#xA;```&#xA;&#xA;We have two tables, almost identical, but with a critical logical distinction between them and two extra columns (`flagged_at` and `reason`).&#xA;&#xA;Let&#39;s look at the problem to see why.&#xA;&#xA;&#xA;## Problem&#xA;&#xA;&gt; Clean-up the deliveries table to remove any records where the delivery_location is &#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;.&#xA;&gt; &#xA;&gt; Move those records to the misdelivered_presents with all the same columns as deliveries plus a flagged_at column with the current time and a reason column with &#34;Invalid delivery location&#34; listed as the reason for each moved record.&#xA;&gt; &#xA;&gt; Make sure your final step shows the misdelivered_presents records that you just moved (i.e. 
don&#39;t include any existing records from the misdelivered_presents table).&#xA;&#xA;Ok, this looks like an easy problem.&#xA;&#xA;- SELECT some data&#xA;- INSERT that data into the other table&#xA;- DELETE that data from the original table&#xA;- SELECT the newly inserted data in the other table&#xA;&#xA;Right?&#xA;&#xA;Unless!&#xA;&#xA;&gt; Santa turned to you.&#xA;&gt;&#xA;&gt; “I don’t want this done in five steps,” he said. “And I don’t want any re-selecting. Move the problem presents out of the delivery system, log them in the vault, and show me exactly what you moved.”&#xA;&#xA;Ouch Santa! Don&#39;t be lazy! Be smart, he says! Huhh!&#xA;&#xA;&#xA;Ok, at least let&#39;s check both the tables: how many rows they have, and the rows that we want to move around.&#xA;&#xA;```sql&#xA;SELECT COUNT(*) AS delivery_count FROM deliveries;&#xA;SELECT COUNT(*) AS misdelivered_present_count FROM misdelivered_presents;&#xA;&#xA;SELECT &#xA;    COUNT(*) AS misdelivered_deliveries_count&#xA;FROM deliveries &#xA;WHERE &#xA;    delivery_location IN (&#xA;        &#39;Volcano Rim&#39;,&#xA;        &#39;Drifting Igloo&#39;,&#xA;        &#39;Abandoned Lighthouse&#39;,&#xA;        &#39;The Vibes&#39;&#xA;    );&#xA;&#xA;SELECT &#xA;    COUNT(*) AS misdelivered_present_count&#xA;FROM misdelivered_presents&#xA;WHERE &#xA;    delivery_location IN (&#xA;        &#39;Volcano Rim&#39;,&#xA;        &#39;Drifting Igloo&#39;,&#xA;        &#39;Abandoned Lighthouse&#39;,&#xA;        &#39;The Vibes&#39;&#xA;    );&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT COUNT(*) FROM deliveries;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 600      |&#xA;+----------+&#xA;sqlite&gt; SELECT COUNT(*) FROM misdelivered_presents;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 50       |&#xA;+----------+&#xA;sqlite&gt; SELECT COUNT(*) FROM deliveries WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The 
Vibes&#39;);&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 103      |&#xA;+----------+&#xA;sqlite&gt; SELECT COUNT(*) FROM misdelivered_presents WHERE delivery_location  IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;);&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 50       |&#xA;+----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;OK, so we need to delete the `103` records from the `deliveries` table and move them into `misdelivered_presents` at the same time.&#xA;&#xA;How are we going to do that in SQLite?&#xA;&#xA;We can try:&#xA;- DELETE FROM deliveries, RETURNING the deleted rows&#xA;- Then INSERT the deleted data into misdelivered_presents&#xA;&#xA;Could that work?&#xA;&#xA;Let&#39;s see.&#xA;&#xA;```sql&#xA;WITH moved AS (&#xA;    DELETE &#xA;        FROM deliveries&#xA;        WHERE delivery_location IN (&#xA;            &#39;Volcano Rim&#39;,&#xA;            &#39;Drifting Igloo&#39;,&#xA;            &#39;Abandoned Lighthouse&#39;,&#xA;            &#39;The Vibes&#39;&#xA;        )&#xA;    RETURNING &#xA;        id, &#xA;        child_name, &#xA;        delivery_location, &#xA;        gift_name, &#xA;        scheduled_at, &#xA;        datetime(&#39;now&#39;) AS flagged_at, &#xA;        &#39;Invalid delivery location&#39; AS reason&#xA;)&#xA;SELECT * FROM moved;&#xA;```&#xA;&#xA;Oops! We can&#39;t put a DELETE inside a CTE.&#xA;&#xA;That&#39;s nasty. 
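&#xA;For the record, a data-modifying CTE like that does work in PostgreSQL, which these puzzles appear to originally target. Here is a sketch of what the single-statement version could look like there; it is not valid in SQLite:&#xA;&#xA;```sql&#xA;-- PostgreSQL only: DELETE ... RETURNING inside a CTE feeding an INSERT&#xA;WITH moved AS (&#xA;    DELETE FROM deliveries&#xA;    WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;    RETURNING id, child_name, delivery_location, gift_name, scheduled_at&#xA;)&#xA;INSERT INTO misdelivered_presents&#xA;    (id, child_name, delivery_location, gift_name, scheduled_at, flagged_at, reason)&#xA;SELECT id, child_name, delivery_location, gift_name, scheduled_at, now(), &#39;Invalid delivery location&#39;&#xA;FROM moved&#xA;RETURNING *;&#xA;```&#xA;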
I thought I could shove that into an insert into the misdelivered table.&#xA;&#xA;But maybe it should be the other way around then?&#xA;&#xA;Insert first, and then use the data to delete?&#xA;&#xA;```sql&#xA;WITH inserted_data AS (&#xA;  INSERT INTO misdelivered_presents (id, child_name, delivery_location, gift_name, scheduled_at)&#xA;  SELECT id, child_name, delivery_location, gift_name, scheduled_at&#xA;  FROM deliveries&#xA;  WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;)&#xA;DELETE FROM deliveries&#xA;WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;);&#xA;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; WITH inserted_data AS (&#xA;  INSERT INTO misdelivered_presents (id, child_name, delivery_location, gift_name, scheduled_at)&#xA;  SELECT id, child_name, delivery_location, gift_name, scheduled_at&#xA;  FROM deliveries&#xA;  WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;)&#xA;DELETE FROM deliveries&#xA;WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;);&#xA;Parse error: near &#34;INSERT&#34;: syntax error&#xA;  rt the selected rows into misdelivered_presents   INSERT INTO misdelivered_pre&#xA;                                      error here ---^&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;Tried it, didn&#39;t work.&#xA;&#xA;Quoting the documentation here.&#xA;&#xA;&gt; Common Table Expressions or CTEs act like temporary views that exist only for the duration of a single SQL statement. There are two kinds of common table expressions: &#34;ordinary&#34; and &#34;recursive&#34;. Ordinary common table expressions are helpful for making queries easier to understand by factoring subqueries out of the main SQL statement. 
Recursive common table expressions provide the ability to do hierarchical or recursive queries of trees and graphs, a capability that is not otherwise available in the SQL language.&#xA;&#xA;So SQLite doesn&#39;t support CTEs with a delete or insert inside them! Sigh!&#xA;&#xA;Now?&#xA;&#xA;BEGIN COMMIT? Atomic Transactions?&#xA;&#xA;Yeah!&#xA;&#xA;Santa wanted it in one go, right? That&#39;s not possible in SQLite, but at least everything will happen or nothing will.&#xA;&#xA;```sql&#xA;BEGIN;&#xA;&#xA;WITH misdelivered_deliveries AS (&#xA;    SELECT * FROM deliveries &#xA;    WHERE delivery_location IN (&#xA;        &#39;Volcano Rim&#39;, &#xA;        &#39;Drifting Igloo&#39;, &#xA;        &#39;Abandoned Lighthouse&#39;, &#xA;        &#39;The Vibes&#39;)&#xA;)&#xA;INSERT INTO misdelivered_presents (&#xA;    id, &#xA;    child_name,&#xA;    delivery_location, &#xA;    gift_name, &#xA;    scheduled_at, &#xA;    flagged_at, &#xA;    reason&#xA;)&#xA;SELECT &#xA;    id,&#xA;    child_name,&#xA;    delivery_location,&#xA;    gift_name,&#xA;    scheduled_at,&#xA;    DATETIME(&#39;now&#39;),&#xA;    &#39;Invalid delivery location&#39;&#xA;FROM misdelivered_deliveries&#xA;RETURNING *;&#xA;&#xA;DELETE FROM deliveries &#xA;WHERE id IN (&#xA;    SELECT id FROM deliveries &#xA;    WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;);&#xA;&#xA;COMMIT;&#xA;```&#xA;&#xA;So here, we defined a CTE that selects the rows from `deliveries` and used it to insert them into the `misdelivered_presents` table. Then a separate query deletes that data from the `deliveries` table.&#xA;&#xA;Yeah! I mean, I don&#39;t think there could be another way to do it!&#xA;&#xA;We could use triggers to insert into one table when rows are deleted from the other. But I think that is too much of a far-fetched solution. 
We might create a trigger and instantly drop it afterwards, as it could populate unwanted data if kept in the database.&#xA;&#xA;### Trigger to insert when deleted&#xA;&#xA;We can create a `TRIGGER` to insert into `misdelivered_presents` when something is deleted from the `deliveries` table. We will separately have to delete the records from the `deliveries` table, but the insert will happen automatically as part of the deletion.&#xA;&#xA;Opening a fresh instance of the database!&#xA;&#xA;```sql&#xA;CREATE TRIGGER move_misdelivered_presents&#xA;BEFORE DELETE ON deliveries&#xA;FOR EACH ROW&#xA;WHEN OLD.delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;BEGIN&#xA;    INSERT INTO misdelivered_presents (&#xA;        id, child_name, delivery_location, gift_name, &#xA;        scheduled_at, flagged_at, reason&#xA;    )&#xA;    VALUES (&#xA;        OLD.id, OLD.child_name, OLD.delivery_location, OLD.gift_name, &#xA;        OLD.scheduled_at, DATETIME(&#39;now&#39;), &#39;Invalid delivery location&#39;&#xA;    );&#xA;END;&#xA;```&#xA;&#xA;This will create the trigger that inserts the row into the `misdelivered_presents` table when it is deleted from the `deliveries` table.&#xA;&#xA;```sql&#xA;DELETE FROM deliveries &#xA;WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;RETURNING *;&#xA;```&#xA;&#xA;Oops! We are returning from the DELETE statement, which is wrong, as the problem stated to select from the `misdelivered_presents` table.&#xA;&#xA;&gt; Make sure your final step shows the misdelivered_presents records that you just moved (i.e. don&#39;t include any existing records from the misdelivered_presents table).&#xA;&#xA;This is invalid then!&#xA;&#xA;Though it&#39;s not technically atomic. 
It happens before the delete, so it can mess up things.&#xA;&#xA;&#xA;&#xA;```&#xA;sqlite&gt; .read day10-inserts.sql&#xA;sqlite&gt; .mode table &#xA;sqlite&gt; CREATE TRIGGER move_misdelivered_presents&#xA;BEFORE DELETE ON deliveries&#xA;FOR EACH ROW&#xA;WHEN OLD.delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;BEGIN&#xA;    INSERT INTO misdelivered_presents (&#xA;        id, child_name, delivery_location, gift_name, &#xA;        scheduled_at, flagged_at, reason&#xA;    )&#xA;    VALUES (&#xA;        OLD.id, OLD.child_name, OLD.delivery_location, OLD.gift_name, &#xA;        OLD.scheduled_at, DATETIME(&#39;now&#39;), &#39;Invalid delivery location&#39;&#xA;    );&#xA;END;&#xA;sqlite&gt; DELETE FROM deliveries &#xA;WHERE delivery_location IN (&#39;Volcano Rim&#39;, &#39;Drifting Igloo&#39;, &#39;Abandoned Lighthouse&#39;, &#39;The Vibes&#39;)&#xA;RETURNING *;&#xA;+-----+------------+----------------------+----------------------+---------------------+&#xA;| id  | child_name |  delivery_location   |      gift_name       |    scheduled_at     |&#xA;+-----+------------+----------------------+----------------------+---------------------+&#xA;| 3   | Mila N.    | The Vibes            | storybook collection | 2025-12-24 21:09:00 |&#xA;| 22  | Lena F.    | Abandoned Lighthouse | plush reindeer       | 2025-12-24 19:08:00 |&#xA;| 23  | Mila N.    | Abandoned Lighthouse | storybook collection | 2025-12-24 20:42:00 |&#xA;| 29  | Mateo C.   | Volcano Rim          | plush reindeer       | 2025-12-24 21:44:00 |&#xA;| 31  | Nia G.     | Drifting Igloo       | robot toy            | 2025-12-24 19:57:00 |&#xA;...&#xA;...&#xA;| 582 | Zara S.    | The Vibes            | teddy bear           | 2025-12-24 21:20:00 |&#xA;| 585 | Layla B.   | Abandoned Lighthouse | wooden train set     | 2025-12-24 18:39:00 |&#xA;| 587 | Nia G.     
| Volcano Rim          | storybook collection | 2025-12-24 18:35:00 |&#xA;| 596 | Omar Q.    | Abandoned Lighthouse | puzzle box           | 2025-12-24 19:28:00 |&#xA;+-----+------------+----------------------+----------------------+---------------------+&#xA;sqlite&gt; DROP TRIGGER move_misdelivered_presents;&#xA;sqlite&gt; SELECT COUNT(*) FROM deliveries;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 497      |&#xA;+----------+&#xA;sqlite&gt; SELECT COUNT(*) FROM misdelivered_presents;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 153      |&#xA;+----------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;So, it gets the result, but I don&#39;t think a `TRIGGER` is the solution to this.&#xA;&#xA;In SQLite, an atomic transaction using BEGIN and COMMIT is the only way to go, right?&#xA;&#xA;Someone prove to Santa that it can&#39;t be done in SQLite in one query? Please!&#xA;&#xA;That&#39;s it from day 10, I have spent enough time on this, banging my head on the sqlite shell!&#xA;&#xA;Off to day 11 tomorrow!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 9: Evergreen Market Orders</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-9</link>
      <description>Advent of SQL, Day 9 - Evergreen Market Orders We are on day 9 of advent of SQL, and I feel good so far. Let&#39;s see what we learn today? Let&#39;s get the inserts fo</description>
      <pubDate>Wed, 24 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL, Day 9 - Evergreen Market Orders&#xA;&#xA;We are on day 9 of advent of SQL, and I feel good so far.&#xA;&#xA;Let&#39;s see what we learn today?&#xA;&#xA;Let&#39;s get the inserts for the day.&#xA;&#xA;```&#xA;sqlite&gt; .read day9-inserts.sql&#xA;sqlite&gt; .schema&#xA;CREATE TABLE orders (&#xA;    id           INT PRIMARY KEY,&#xA;    customer_id  INT,&#xA;    created_at   TIMESTAMP,&#xA;    order_data   JSONB&#xA;);&#xA;sqlite&gt; .mode table&#xA;sqlite&gt; SELECT * FROM orders limit 10;&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| id | customer_id |     created_at      |                          order_data                          |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 1  | 1           | 2025-11-21 13:08:22 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: tru |&#xA;|    |             |                     | e}}                                                          |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 2  | 1           | 2025-11-21 18:42:58 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;high |&#xA;|    |             |                     | &#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}                              |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 3  | 1           | 2025-11-21 21:01:46 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: fal |&#xA;|    |             |                     | se}}                                                         |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 4  
| 1           | 2025-11-24 13:17:27 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: tru |&#xA;|    |             |                     | e}}                                                          |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 5  | 1           | 2025-11-24 21:09:46 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: fa |&#xA;|    |             |                     | lse}}                                                        |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 6  | 1           | 2025-11-25 07:24:55 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;mediu |&#xA;|    |             |                     | m&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}                              |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 7  | 1           | 2025-11-25 17:42:36 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: fal |&#xA;|    |             |                     | se}}                                                         |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 8  | 1           | 2025-11-27 02:34:24 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;express&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true |&#xA;|    |             |                     | }}                                                           |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 9  | 1           | 2025-11-30 22:43:54 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;gift&#34;: 
{&#34;wrapped&#34;: tr |&#xA;|    |             |                     | ue}}                                                         |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;| 10 | 1           | 2025-12-01 04:03:33 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;express&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;medium |&#xA;|    |             |                     | &#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}                              |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Looks like we will deal with JSON today, which seems exciting. I haven&#39;t dealt with JSON in SQLite yet; today will change that.&#xA;&#xA;Let&#39;s get some sample inserts for you to play with in the browser.&#xA;&#xA;Limiting to 20 here, there are more than 400 rows!&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS orders;&#xA;&#xA;CREATE TABLE orders (&#xA;    id           INT PRIMARY KEY,&#xA;    customer_id  INT,&#xA;    created_at   TIMESTAMP,&#xA;    order_data   JSONB&#xA;);&#xA;&#xA;INSERT INTO orders (id, customer_id, created_at, order_data) VALUES&#xA;    (1, 1, &#39;2025-11-21 13:08:22&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (2, 1, &#39;2025-11-21 18:42:58&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;high&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (3, 1, &#39;2025-11-21 21:01:46&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (4, 1, &#39;2025-11-24 13:17:27&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (5, 1, &#39;2025-11-24 21:09:46&#39;, &#39;{&#34;shipping&#34;: 
{&#34;method&#34;: &#34;overnight&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (6, 1, &#39;2025-11-25 07:24:55&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;medium&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (7, 1, &#39;2025-11-25 17:42:36&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (8, 1, &#39;2025-11-27 02:34:24&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;express&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (9, 1, &#39;2025-11-30 22:43:54&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (10, 1, &#39;2025-12-01 04:03:33&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;express&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;medium&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (11, 1, &#39;2025-12-02 05:19:10&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;low&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (12, 1, &#39;2025-12-03 16:25:56&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;medium&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (13, 1, &#39;2025-12-10 19:34:28&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (14, 1, &#39;2025-12-16 19:23:53&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;express&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (15, 2, &#39;2025-11-23 19:11:23&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (16, 2, &#39;2025-11-28 15:23:27&#39;, 
&#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;express&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (17, 2, &#39;2025-11-30 12:05:36&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;low&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: true}}&#39;),&#xA;    (18, 2, &#39;2025-12-03 07:03:06&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (19, 2, &#39;2025-12-07 13:55:13&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;express&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;high&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;),&#xA;    (20, 2, &#39;2025-12-08 07:17:31&#39;, &#39;{&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}&#39;);&#xA;```&#xA;&#xA;```sql&#xA;SELECT * FROM orders;&#xA;```&#xA;&#xA;Let&#39;s get to the problem now.&#xA;&#xA;## Problem&#xA;&#xA;&gt; Build a report using the orders table that shows the latest order for each customer, along with their requested shipping method, gift wrap choice (as true or false), and the risk flag in separate columns.&#xA;&gt; &#xA;&gt; Order the report by the most recent order first so Evergreen Market can reach out to them ASAP.&#xA;&#xA;Ok, so for each customer we need the latest order with the following details:&#xA;- shipping method&#xA;- gift wrap choice&#xA;- risk flag&#xA;&#xA;All of these, I think, are in the same column as a JSON string or blob. We need to extract them out of that column.&#xA;&#xA;Let&#39;s first check the `orders` table.&#xA;&#xA;It has a few columns:&#xA;- id&#xA;- customer_id&#xA;- created_at&#xA;- order_data&#xA;&#xA;We do require `order_data`, as that is the column holding the JSON.&#xA;&#xA;Also, the problem said to give the most recent order, so we need to order by `created_at` in descending order, the latest first. 
Also, we need it per customer, so we need to group by `customer_id`.&#xA;&#xA;Let&#39;s see how to get the data inside JSON in SQLite.&#xA;&#xA;### JSON Extract&#xA;&#xA;Well, we have [json_extract](https://sqlite.org/json1.html#jex), which can give us the value of a key from the given JSON data string.&#xA;&#xA;The function takes the column containing the JSON data as its first parameter; the second parameter is the path to the key. In this case, if we want to get the `method` from the `shipping` key, we can use `$.shipping.method`, which means: from the root `$`, get the `shipping` key, and inside that (the `shipping` key) give the value of the `method` key.&#xA;&#xA;If the path is not present (in our case, the `risk` key is only rarely present in the original JSON data), the function skips the further key lookup and returns `NULL`.&#xA;&#xA;```sql&#xA;SELECT json_extract(orders.order_data, &#39;$.shipping.method&#39;) FROM orders LIMIT 5;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; select json_extract(orders.order_data, &#39;$.shipping.method&#39;) FROM orders LIMIT 5;&#xA;+------------------------------------------------------+&#xA;| json_extract(orders.order_data, &#39;$.shipping.method&#39;) |&#xA;+------------------------------------------------------+&#xA;| standard                                             |&#xA;| overnight                                            |&#xA;| standard                                             |&#xA;| standard                                             |&#xA;| overnight                                            |&#xA;+------------------------------------------------------+&#xA;sqlite&gt; select *, json_extract(orders.order_data, &#39;$.shipping.method&#39;) FROM orders LIMIT 5;&#xA;+----+-------------+---------------------+--------------------------------------------------------------+------------------------------------------------------+&#xA;| id | customer_id |     created_at      |         
                 order_data                          | json_extract(orders.order_data, &#39;$.shipping.method&#39;) |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+------------------------------------------------------+&#xA;| 1  | 1           | 2025-11-21 13:08:22 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: tru | standard                                             |&#xA;|    |             |                     | e}}                                                          |                                                      |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+------------------------------------------------------+&#xA;| 2  | 1           | 2025-11-21 18:42:58 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;risk&#34;: {&#34;flag&#34;: &#34;high | overnight                                            |&#xA;|    |             |                     | &#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: false}}                              |                                                      |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+------------------------------------------------------+&#xA;| 3  | 1           | 2025-11-21 21:01:46 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: fal | standard                                             |&#xA;|    |             |                     | se}}                                                         |                                                      |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+------------------------------------------------------+&#xA;| 4  | 1           | 2025-11-24 13:17:27 | {&#34;shipping&#34;: {&#34;method&#34;: 
&#34;standard&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: tru | standard                                             |&#xA;|    |             |                     | e}}                                                          |                                                      |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+------------------------------------------------------+&#xA;| 5  | 1           | 2025-11-24 21:09:46 | {&#34;shipping&#34;: {&#34;method&#34;: &#34;overnight&#34;}, &#34;gift&#34;: {&#34;wrapped&#34;: fa | overnight                                            |&#xA;|    |             |                     | lse}}                                                        |                                                      |&#xA;+----+-------------+---------------------+--------------------------------------------------------------+------------------------------------------------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;Let&#39;s grab the other two.&#xA;&#xA;```sql&#xA;SELECT&#xA;    json_extract(order_data, &#39;$.shipping.method&#39;),&#xA;    json_extract(order_data, &#39;$.gift.wrapped&#39;),&#xA;    json_extract(order_data, &#39;$.risk.flag&#39;) &#xA;FROM orders;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; select json_extract(order_data, &#39;$.shipping.method&#39;), json_extract(order_data, &#39;$.gift.wrapped&#39;), json_extract(order_data, &#39;$.risk.flag&#39;) FROM orders LIMIT 5;&#xA;+-----------------------------------------------+--------------------------------------------+-----------------------------------------+&#xA;| json_extract(order_data, &#39;$.shipping.method&#39;) | json_extract(order_data, &#39;$.gift.wrapped&#39;) | json_extract(order_data, &#39;$.risk.flag&#39;) |&#xA;+-----------------------------------------------+--------------------------------------------+-----------------------------------------+&#xA;| standard                                      | 1    
                                      |&#xA;| overnight                                     | 0                                          | high                                    |&#xA;| standard                                      | 0                                          |                                         |&#xA;| standard                                      | 1                                          |                                         |&#xA;| overnight                                     | 0                                          |                                         |&#xA;+-----------------------------------------------+--------------------------------------------+-----------------------------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Hmm, interesting: it automatically converted `false` to `0` and `true` to `1`, even though the actual JSON data has `true` and `false`. Nice, but it could trip you up the other way as well, if you were checking for `true` or `false` instead of `0` and `1`. In SQLite they are the same thing: booleans are stored as the integers `1` and `0`.&#xA;&#xA;Do we have everything we need now?&#xA;&#xA;Well, let&#39;s group by `customer_id` and order by `created_at` to get each customer along with their latest order.&#xA;&#xA;```sql&#xA;SELECT *, json_extract(order_data, &#39;$.shipping.method&#39;), json_extract(order_data, &#39;$.gift.wrapped&#39;), json_extract(order_data, &#39;$.risk.flag&#39;) FROM orders GROUP BY customer_id ORDER by created_at LIMIT 5;&#xA;&#xA;```&#xA;&#xA;But here I see the problem: how can we group by `customer_id` and also pick the latest order for that customer? 
The ORDER BY only runs after the grouping is done, so the row kept for each customer is arbitrary, not necessarily the latest order.&#xA;&#xA;Hmm!&#xA;&#xA;We need a subquery to get the latest order for each customer, it seems.&#xA;&#xA;```sql&#xA;SELECT &#xA;    orders.customer_id,&#xA;    orders.created_at,&#xA;    json_extract(orders.order_data, &#39;$.shipping.method&#39;) AS shipping_method,&#xA;    json_extract(orders.order_data, &#39;$.gift.wrapped&#39;) AS gift_wrap,&#xA;    json_extract(orders.order_data, &#39;$.risk.flag&#39;) AS risk_flag&#xA;FROM orders&#xA;WHERE orders.created_at = (&#xA;    SELECT MAX(created_at)&#xA;    FROM orders AS latest_order&#xA;    WHERE orders.customer_id = latest_order.customer_id&#xA;)&#xA;ORDER BY orders.created_at DESC;&#xA;&#xA;```&#xA;We just added this:&#xA;&#xA;```sql&#xA;WHERE orders.created_at = (&#xA;    SELECT MAX(created_at)&#xA;    FROM orders AS latest_order&#xA;    WHERE orders.customer_id = latest_order.customer_id&#xA;)&#xA;```&#xA;This correlated subquery keeps only the row with the maximum `created_at` per customer, i.e. the latest order; once it isolates that row, we can pull the JSON details from it.&#xA;&#xA;```&#xA;sqlite&gt; SELECT &#xA;    orders.customer_id,&#xA;    orders.created_at,&#xA;    json_extract(orders.order_data, &#39;$.shipping.method&#39;) AS shipping_method,&#xA;    json_extract(orders.order_data, &#39;$.gift.wrapped&#39;) AS gift_wrap,&#xA;    json_extract(orders.order_data, &#39;$.risk.flag&#39;) AS risk_flag&#xA;FROM orders&#xA;WHERE orders.created_at = (&#xA;    SELECT MAX(created_at)&#xA;    FROM orders AS latest_order&#xA;    WHERE orders.customer_id = latest_order.customer_id&#xA;)&#xA;ORDER BY orders.created_at DESC;&#xA;&#xA;+-------------+---------------------+-----------------+-----------+-----------+&#xA;| customer_id |     created_at      | shipping_method | gift_wrap | risk_flag 
|&#xA;+-------------+---------------------+-----------------+-----------+-----------+&#xA;| 32          | 2025-12-17 21:17:39 | overnight       | 0         |           |&#xA;| 15          | 2025-12-17 19:21:33 | express         | 0         | medium    |&#xA;| 50          | 2025-12-17 14:47:54 | express         | 1         | low       |&#xA;| 43          | 2025-12-17 14:23:46 | express         | 1         |           |&#xA;| 27          | 2025-12-17 14:05:13 | standard        | 1         |           |&#xA;| 3           | 2025-12-17 14:02:28 | standard        | 1         | high      |&#xA;| 49          | 2025-12-17 13:28:49 | express         | 1         | high      |&#xA;| 36          | 2025-12-17 11:11:29 | overnight       | 1         |           |&#xA;| 31          | 2025-12-17 08:05:46 | express         | 0         |           |&#xA;| 16          | 2025-12-17 07:32:36 | express         | 0         |           |&#xA;| 38          | 2025-12-17 06:05:12 | standard        | 1         |           |&#xA;| 44          | 2025-12-17 05:28:54 | standard        | 1         |           |&#xA;| 9           | 2025-12-17 04:33:08 | express         | 1         |           |&#xA;| 23          | 2025-12-17 03:01:49 | express         | 0         |           |&#xA;| 21          | 2025-12-16 23:53:14 | overnight       | 1         |           |&#xA;| 25          | 2025-12-16 20:49:58 | overnight       | 1         | high      |&#xA;| 46          | 2025-12-16 19:38:37 | standard        | 0         |           |&#xA;| 1           | 2025-12-16 19:23:53 | express         | 0         |           |&#xA;| 28          | 2025-12-16 18:20:55 | standard        | 0         | low       |&#xA;| 40          | 2025-12-16 17:54:05 | express         | 0         |           |&#xA;| 13          | 2025-12-16 16:11:16 | standard        | 1         |           |&#xA;| 24          | 2025-12-16 14:19:45 | overnight       | 0         |           |&#xA;| 11          | 2025-12-16 11:20:31 | standard        | 1     
    | medium    |&#xA;| 17          | 2025-12-16 08:19:36 | standard        | 0         |           |&#xA;| 4           | 2025-12-16 04:38:51 | express         | 0         |           |&#xA;| 34          | 2025-12-16 02:11:57 | express         | 0         |           |&#xA;| 30          | 2025-12-15 15:32:04 | overnight       | 0         | medium    |&#xA;| 48          | 2025-12-15 13:03:59 | standard        | 1         |           |&#xA;| 41          | 2025-12-15 13:00:00 | standard        | 0         | high      |&#xA;| 45          | 2025-12-15 11:37:57 | standard        | 0         |           |&#xA;| 7           | 2025-12-14 23:39:47 | express         | 0         |           |&#xA;| 35          | 2025-12-14 22:46:36 | express         | 1         | high      |&#xA;| 47          | 2025-12-14 20:53:07 | standard        | 0         |           |&#xA;| 22          | 2025-12-14 12:38:58 | standard        | 0         | medium    |&#xA;| 12          | 2025-12-14 07:59:28 | standard        | 1         | medium    |&#xA;| 18          | 2025-12-14 04:55:34 | overnight       | 0         | low       |&#xA;| 20          | 2025-12-14 04:54:07 | overnight       | 0         |           |&#xA;| 14          | 2025-12-13 07:44:19 | standard        | 1         |           |&#xA;| 6           | 2025-12-13 07:03:12 | overnight       | 1         |           |&#xA;| 10          | 2025-12-13 04:23:37 | standard        | 0         | medium    |&#xA;| 19          | 2025-12-13 03:29:15 | standard        | 0         |           |&#xA;| 8           | 2025-12-12 12:42:18 | overnight       | 0         |           |&#xA;| 26          | 2025-12-11 17:35:46 | standard        | 0         | low       |&#xA;| 37          | 2025-12-11 13:55:35 | overnight       | 1         |           |&#xA;| 33          | 2025-12-09 12:30:54 | express         | 1         |           |&#xA;| 2           | 2025-12-08 07:17:31 | standard        | 0         |           |&#xA;| 42          | 2025-12-08 02:48:12 | 
overnight       | 0         | medium    |&#xA;| 5           | 2025-12-06 17:53:53 | overnight       | 1         |           |&#xA;| 39          | 2025-12-06 14:38:29 | overnight       | 1         |           |&#xA;| 29          | 2025-12-03 05:10:32 | overnight       | 1         | high      |&#xA;+-------------+---------------------+-----------------+-----------+-----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Now, that is it! A quick solution, I guess, for getting this data sorted.&#xA;&#xA;However, there is one more way, without needing the subquery.&#xA;&#xA;### ROW NUMBER - Window Function&#xA;&#xA;We can create a partition for each customer with `customer_id`, sort by the `created_at` date latest first (descending), and simply take the first row for extraction. This is kind of the same thing, but a kind of &#39;gives a good feel to me&#39; solution. Elite Mindset!&#xA;&#xA;```sql&#xA;WITH ranked_orders AS (&#xA;    SELECT &#xA;        orders.customer_id,&#xA;        orders.created_at,&#xA;        json_extract(orders.order_data, &#39;$.shipping.method&#39;) AS shipping_method,&#xA;        json_extract(orders.order_data, &#39;$.gift.wrapped&#39;) AS gift_wrap,&#xA;        json_extract(orders.order_data, &#39;$.risk.flag&#39;) AS risk_flag,&#xA;        ROW_NUMBER() OVER (PARTITION BY orders.customer_id ORDER BY orders.created_at DESC) AS row_num&#xA;    FROM orders&#xA;)&#xA;SELECT &#xA;    customer_id,&#xA;    created_at,&#xA;    shipping_method,&#xA;    gift_wrap,&#xA;    risk_flag&#xA;FROM ranked_orders&#xA;WHERE row_num = 1&#xA;ORDER BY created_at DESC;&#xA;```&#xA;&#xA;We are just doing the same thing as explained.&#xA;&#xA;This thing&#xA;&#xA;`ROW_NUMBER() OVER (PARTITION BY orders.customer_id ORDER BY orders.created_at DESC) AS row_num`&#xA;&#xA;It partitions the table `orders` by `customer_id` and orders by `created_at`, latest first. 
Each row gets a `row_num` within its customer&#39;s partition, and we grab the 1st row per customer to get the latest order (filtering with WHERE when querying the CTE).&#xA;&#xA;We created the CTE `ranked_orders` (you could also call it `latest_orders`) so that we can filter `row_num` down to `1`, and inside it we already extracted the JSON fields along with the other columns.&#xA;&#xA;Simple!&#xA;&#xA;That is it!&#xA;&#xA;Day 9 was easy peasy!&#xA;&#xA;Onwards day 10! Catch you tomorrow!
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 8: Product Catalog</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-8</link>
      <description>Advent of SQL - Day 8, Product Catalog Whopsies! This is day 8. Let&#39;s get straigh... HOOH! We need to clean up some SQL for running in SQLite. Just cleaning up</description>
      <pubDate>Tue, 23 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL - Day 8, Product Catalog&#xA;&#xA;Whopsies! This is day 8.&#xA;&#xA;Let&#39;s get straigh...&#xA;&#xA;HOOH! We need to clean up some SQL for running in SQLite.&#xA;&#xA;```bash&#xA;sed -i &#39;s/TIMESTAMP[[:space:]]*//g&#39; day8-inserts-sqlite.sql&#xA;```&#xA;&#xA;Just cleaning up `TIMESTAMP` in `INSERT` before the date value.&#xA;&#xA;Here we go:&#xA;The SQL to run in SQLite.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS price_changes;&#xA;DROP TABLE IF EXISTS products;&#xA;&#xA;CREATE TABLE products (&#xA;    product_id INT PRIMARY KEY,&#xA;    product_name TEXT&#xA;);&#xA;&#xA;CREATE TABLE price_changes (&#xA;    id INT PRIMARY KEY,&#xA;    product_id INT,&#xA;    price NUMERIC(10,2),&#xA;    effective_timestamp &#xA;);&#xA;&#xA;INSERT INTO products (product_id, product_name) VALUES&#xA;    (1, &#39;Deluxe Sled&#39;),&#xA;    (2, &#39;Holiday Trail Mix Trio&#39;),&#xA;    (3, &#39;Premium Cinnamon Roasted Almonds&#39;),&#xA;    (4, &#39;Deluxe Wrapping Paper&#39;),&#xA;    (5, &#39;Deluxe Roasted Cashews&#39;),&#xA;    (6, &#39;Festive Cookware Set&#39;),&#xA;    (7, &#39;Deluxe Mug&#39;),&#xA;    (8, &#39;Premium Sled&#39;),&#xA;    (9, &#39;Essential Sled&#39;),&#xA;    (10, &#39;Family Snow Boots&#39;),&#xA;    (11, &#39;Family Dark Chocolate Almonds&#39;),&#xA;    (12, &#39;Premium Festive Scarf&#39;),&#xA;    (13, &#39;Essential Cookie Decorating Kit&#39;),&#xA;    (14, &#39;Festive White Chocolate Popcorn&#39;),&#xA;    (15, &#39;Cozy Puzzle&#39;),&#xA;    (16, &#39;Holiday Cheddar Popcorn&#39;),&#xA;    (17, &#39;Premium Board Game&#39;),&#xA;    (18, &#39;Deluxe Pecan Praline Bites&#39;),&#xA;    (19, &#39;Cozy Almond Brittle&#39;),&#xA;    (20, &#39;Winter Sled&#39;);&#xA;&#xA;INSERT INTO price_changes (id, product_id, price, effective_timestamp) VALUES&#xA;    (1, 1, 148.28, &#39;2025-12-01 05:25:35&#39;),&#xA;    (2, 1, 148.63, &#39;2025-12-02 18:15:33&#39;),&#xA;    (3, 1, 126.78, &#39;2025-12-02 18:40:38&#39;),&#xA;    
(4, 1, 119.12, &#39;2025-12-03 10:14:35&#39;),&#xA;    (5, 1, 98.57, &#39;2025-12-04 04:14:31&#39;),&#xA;    (6, 1, 88.49, &#39;2025-12-06 19:02:40&#39;),&#xA;    (7, 1, 80.88, &#39;2025-12-07 10:43:54&#39;),&#xA;    (8, 1, 78.88, &#39;2025-12-08 06:45:39&#39;),&#xA;    (9, 1, 80.24, &#39;2025-12-08 16:11:11&#39;),&#xA;    (10, 1, 73.9, &#39;2025-12-10 14:33:43&#39;),&#xA;    (11, 1, 88.2, &#39;2025-12-12 02:21:09&#39;),&#xA;    (12, 1, 99.03, &#39;2025-12-12 02:58:14&#39;),&#xA;    (13, 1, 100.18, &#39;2025-12-14 15:58:03&#39;),&#xA;    (14, 1, 106.91, &#39;2025-12-16 01:51:05&#39;),&#xA;    (15, 1, 109.25, &#39;2025-12-16 16:01:53&#39;),&#xA;    (16, 2, 29.54, &#39;2025-12-03 14:21:10&#39;),&#xA;    (17, 2, 34.33, &#39;2025-12-03 19:14:31&#39;),&#xA;    (18, 2, 39.08, &#39;2025-12-04 06:13:48&#39;),&#xA;    (19, 2, 32.71, &#39;2025-12-04 18:33:17&#39;),&#xA;    (20, 2, 31.71, &#39;2025-12-05 22:36:14&#39;),&#xA;    (21, 2, 28.88, &#39;2025-12-06 02:42:02&#39;),&#xA;    (22, 2, 23.14, &#39;2025-12-07 09:46:54&#39;),&#xA;    (23, 2, 25.65, &#39;2025-12-07 10:03:45&#39;),&#xA;    (24, 2, 27.06, &#39;2025-12-07 14:39:15&#39;),&#xA;    (25, 2, 24.48, &#39;2025-12-07 20:08:05&#39;),&#xA;    (26, 2, 26.84, &#39;2025-12-09 07:44:32&#39;),&#xA;    (27, 2, 27.39, &#39;2025-12-13 06:25:19&#39;),&#xA;    (28, 2, 26.6, &#39;2025-12-14 10:16:34&#39;),&#xA;    (29, 2, 21.38, &#39;2025-12-15 16:20:20&#39;),&#xA;    (30, 2, 17.75, &#39;2025-12-16 09:28:13&#39;);&#xA;```&#xA;&#xA;We can get started.&#xA;&#xA;## Problem&#xA;&#xA;&gt; Generate a report, using the products and `price_changes` tables for leadership that returns the `product_name`, `current_price`, `previous_price`, and the difference between the current and previous prices&#xA;&#xA;&#xA;So what we need is &#xA;&#xA;- product_name&#xA;- current_price (latest)&#xA;- previous_price (just before the latest)&#xA;- price_difference = current - previous&#xA;&#xA;Again we have to meddle with dates, maybe, maybe 
not!&#xA;&#xA;### With CTEs and JOINs&#xA;&#xA;Let&#39;s start with the simplest approach. We need 2 prices: the latest (highest timestamp) and the 2nd latest (the 2nd highest timestamp). We can get the first pretty easily, but what about the second?&#xA;&#xA;Well, if we get the first, then the second should be easy to get, right? Right? Because it will be just before it. Well, not directly.&#xA;&#xA;Let&#39;s first grab the max timestamp.&#xA;&#xA;```sql&#xA;SELECT &#xA;    product_id,&#xA;    MAX(effective_timestamp) AS latest_timestamp&#xA;FROM price_changes&#xA;GROUP BY product_id;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT product_id, MAX(effective_timestamp) AS max_ts&#xA;        FROM price_changes&#xA;        GROUP BY product_id;&#xA;+------------+---------------------+&#xA;| product_id |       max_ts        |&#xA;+------------+---------------------+&#xA;| 1          | 2025-12-16 16:01:53 |&#xA;| 2          | 2025-12-16 09:28:13 |&#xA;| 3          | 2025-12-15 03:20:11 |&#xA;| 4          | 2025-12-16 01:33:41 |&#xA;| 5          | 2025-12-12 10:11:48 |&#xA;| 6          | 2025-12-15 11:31:40 |&#xA;| 7          | 2025-12-16 03:00:51 |&#xA;| 8          | 2025-12-15 22:33:48 |&#xA;| 9          | 2025-12-15 20:05:34 |&#xA;| 10         | 2025-12-15 20:53:45 |&#xA;...&#xA;...&#xA;| 139        | 2025-12-16 04:46:33 |&#xA;| 140        | 2025-12-16 21:19:30 |&#xA;| 141        | 2025-12-15 09:50:36 |&#xA;| 142        | 2025-12-16 18:39:51 |&#xA;| 143        | 2025-12-15 07:27:06 |&#xA;| 144        | 2025-12-16 16:25:16 |&#xA;| 146        | 2025-12-13 07:07:19 |&#xA;| 148        | 2025-12-16 09:30:11 |&#xA;| 149        | 2025-12-13 16:40:21 |&#xA;| 150        | 2025-12-13 08:24:43 |&#xA;+------------+---------------------+&#xA;```&#xA;&#xA;We got the latest timestamp for each product. But we wanted the prices. 
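As a quick sanity check, here is a small Python sketch using the built-in `sqlite3` module. The rows are made up for illustration (not the puzzle data), but the table shape matches the schema above, and it reproduces this MAX-per-group step:

```python
import sqlite3

# Tiny illustrative price_changes table (made-up rows, not the puzzle data).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE price_changes (product_id INT, price NUMERIC, effective_timestamp TEXT)"
)
conn.executemany(
    "INSERT INTO price_changes VALUES (?, ?, ?)",
    [
        (1, 148.28, "2025-12-01 05:25:35"),
        (1, 109.25, "2025-12-16 16:01:53"),
        (2, 29.54, "2025-12-03 14:21:10"),
        (2, 17.75, "2025-12-16 09:28:13"),
    ],
)

# MAX per group: one row per product with its latest timestamp.
# Note there is no portable way to also pull the matching price here.
latest = conn.execute(
    "SELECT product_id, MAX(effective_timestamp) AS latest_timestamp "
    "FROM price_changes GROUP BY product_id ORDER BY product_id"
).fetchall()
print(latest)  # [(1, '2025-12-16 16:01:53'), (2, '2025-12-16 09:28:13')]
```

The grouped query hands back one row per product with its latest timestamp, which is exactly the intermediate result we still have to join back against to recover the price.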
Well, we can&#39;t grab the price here, since we are grouping!&#xA;&#xA;We can select the timestamp because it is wrapped in the `MAX` aggregate function, but among the many rows for a single product, how would we aggregate the price? `MIN`, `MAX`, `AVG`? That&#39;s not what we want; we just want the price at that timestamp.&#xA;&#xA;Well, we need to join back to the same table on that timestamp and grab the price.&#xA;&#xA;```sql&#xA;SELECT &#xA;    price_changes.product_id,&#xA;    price_changes.price AS current_price,&#xA;    price_changes.effective_timestamp AS latest_ts&#xA;FROM price_changes&#xA;JOIN (&#xA;    SELECT &#xA;        product_id, &#xA;        MAX(effective_timestamp) AS latest_timestamp&#xA;    FROM price_changes&#xA;    GROUP BY product_id&#xA;) AS latest_price_change&#xA;  ON price_changes.product_id = latest_price_change.product_id&#xA; AND price_changes.effective_timestamp = latest_price_change.latest_timestamp;&#xA;```&#xA;&#xA;Here, we first specify what we want:&#xA;- `product_id`&#xA;- `current_price`, which is the price at the latest timestamp&#xA;- `latest_timestamp`, which is the latest time recorded for the product&#39;s price.&#xA;&#xA;We group by `product_id` since there are prices recorded at various timestamps for each product, and we need the latest one.&#xA;So, we use a nested query to get the latest timestamp per product, and join the price with that same timestamp.&#xA; &#xA;This condition `price_changes.effective_timestamp = latest_price_change.latest_timestamp` gets us the `price` for the `latest_timestamp`. We first get each timestamp for the product and then find its max; that&#39;s the inner query we joined this table to. 
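To double-check the join-back idea end to end, here is another small Python `sqlite3` sketch with the same kind of made-up rows (illustrative only, not the puzzle data):

```python
import sqlite3

# Illustrative price_changes data (made-up rows, not the puzzle data).
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE price_changes (product_id INT, price NUMERIC, effective_timestamp TEXT)"
)
conn.executemany(
    "INSERT INTO price_changes VALUES (?, ?, ?)",
    [
        (1, 148.28, "2025-12-01 05:25:35"),
        (1, 109.25, "2025-12-16 16:01:53"),
        (2, 29.54, "2025-12-03 14:21:10"),
        (2, 17.75, "2025-12-16 09:28:13"),
    ],
)

# Join the table back to its own per-product MAX(timestamp) to pick up
# the price that belongs to that latest timestamp.
current = conn.execute(
    """
    SELECT price_changes.product_id, price_changes.price AS current_price
    FROM price_changes
    JOIN (
        SELECT product_id, MAX(effective_timestamp) AS latest_timestamp
        FROM price_changes
        GROUP BY product_id
    ) AS latest_price_change
      ON price_changes.product_id = latest_price_change.product_id
     AND price_changes.effective_timestamp = latest_price_change.latest_timestamp
    ORDER BY price_changes.product_id
    """
).fetchall()
print(current)  # [(1, 109.25), (2, 17.75)]
```

Each product now carries the price that was in effect at its latest timestamp, which is what the self join above produces on the real data.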
A self join, with a different thing pulled from each side.&#xA;&#xA;This gives us 2 things:&#xA;- Product id&#xA;- Price at the latest timestamp&#xA; &#xA;We don&#39;t really want the timestamp in the final result; it&#39;s just a criterion, an intermediate value for getting the current and previous prices of each product.&#xA;&#xA;```&#xA;sqlite&gt; &#xA;sqlite&gt; SELECT &#xA;    price_changes.product_id,&#xA;    price_changes.price AS current_price,&#xA;    price_changes.effective_timestamp AS latest_ts&#xA;FROM price_changes&#xA;JOIN (&#xA;    SELECT &#xA;        product_id, &#xA;        MAX(effective_timestamp) AS latest_timestamp&#xA;    FROM price_changes&#xA;    GROUP BY product_id&#xA;) AS latest_price_change&#xA;  ON price_changes.product_id = latest_price_change.product_id&#xA; AND price_changes.effective_timestamp = latest_price_change.latest_timestamp;&#xA;+------------+---------------+---------------------+&#xA;| product_id | current_price |      latest_ts      |&#xA;+------------+---------------+---------------------+&#xA;| 1          | 109.25        | 2025-12-16 16:01:53 |&#xA;| 2          | 17.75         | 2025-12-16 09:28:13 |&#xA;| 3          | 143.65        | 2025-12-15 03:20:11 |&#xA;| 4          | 98.51         | 2025-12-16 01:33:41 |&#xA;| 5          | 124.04        | 2025-12-12 10:11:48 |&#xA;| 6          | 84.14         | 2025-12-15 11:31:40 |&#xA;| 7          | 123.09        | 2025-12-16 03:00:51 |&#xA;| 8          | 221.06        | 2025-12-15 22:33:48 |&#xA;| 9          | 255.88        | 2025-12-15 20:05:34 |&#xA;| 10         | 57.99         | 2025-12-15 20:53:45 |&#xA;...&#xA;...&#xA;| 139        | 16.41         | 2025-12-16 04:46:33 |&#xA;| 140        | 173.05        | 2025-12-16 21:19:30 |&#xA;| 141        | 69.97         | 2025-12-15 09:50:36 |&#xA;| 142        | 35.05         | 2025-12-16 18:39:51 |&#xA;| 143        | 153.94        | 2025-12-15 07:27:06 |&#xA;| 144        | 118.21        | 2025-12-16 16:25:16 |&#xA;| 146        | 54.73         | 
2025-12-13 07:07:19 |&#xA;| 148        | 107.81        | 2025-12-16 09:30:11 |&#xA;| 149        | 72.6          | 2025-12-13 16:40:21 |&#xA;| 150        | 138.66        | 2025-12-13 08:24:43 |&#xA;+------------+---------------+---------------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;Now, we need to get the price before this max timestamp. How do we get it?&#xA;&#xA;Do we need to join again? Yes...&#xA;&#xA;We need a subquery for the timestamp just before it. But how will we get the max timestamp for each product? Well, that&#39;s what we wrote above.&#xA;&#xA;We can convert that to a CTE.&#xA;&#xA;```sql&#xA;&#xA;WITH latest AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes                             &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;)&#xA;SELECT * FROM latest;&#xA;```&#xA;&#xA;We just got the same thing, but now we can use `latest` as a temporary table in the query.&#xA;&#xA;```sql&#xA;WITH latest AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes&#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = 
latest_price_changes.max_timestamp&#xA;),  previous AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id&#xA;)&#xA;SELECT * FROM previous JOIN latest ON previous.product_id = latest.product_id;&#xA;```&#xA;&#xA;&#xA;So, ok, this is getting long.&#xA;&#xA;We just added this: &#xA;&#xA;```sql&#xA;SELECT &#xA;    price_changes.product_id,&#xA;    price_changes.price AS previous_price,&#xA;    MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;FROM price_changes&#xA;JOIN latest&#xA;    ON price_changes.product_id = latest.product_id&#xA;WHERE &#xA;    price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;GROUP BY price_changes.product_id&#xA;```&#xA;&#xA;On its own this snippet won&#39;t run, since it references the `latest` table; that&#39;s why we turned `latest` into a CTE.&#xA;&#xA;So, to get the 2nd latest timestamp for each product, we do this.&#xA;&#xA;```sql&#xA;WITH latest AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes                             &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;) &#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS 
previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id;&#xA;```&#xA;&#xA;We use:&#xA; &#xA;```sql&#xA; JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;```&#xA;to match only the timestamps strictly before the latest one, excluding it, so that `MAX(price_changes.effective_timestamp) AS previous_timestamp` picks the 2nd latest timestamp from that subset.&#xA;&#xA;This gives us all the previous timestamps, i.e. the price just before the max timestamp for each product.&#xA;&#xA;```&#xA;sqlite&gt; WITH latest AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes                             &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;) &#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id;&#xA;+------------+----------------+---------------------+&#xA;| product_id | previous_price | previous_timestamp  
|&#xA;+------------+----------------+---------------------+&#xA;| 1          | 106.91         | 2025-12-16 01:51:05 |&#xA;| 2          | 21.38          | 2025-12-15 16:20:20 |&#xA;| 3          | 159.65         | 2025-12-14 09:52:09 |&#xA;| 4          | 105.6          | 2025-12-14 14:09:20 |&#xA;| 5          | 129.23         | 2025-12-12 04:08:45 |&#xA;| 6          | 88.97          | 2025-12-15 02:13:04 |&#xA;| 7          | 127.14         | 2025-12-13 07:25:12 |&#xA;| 8          | 241.99         | 2025-12-14 19:31:40 |&#xA;| 9          | 259.56         | 2025-12-12 12:47:13 |&#xA;| 10         | 64.88          | 2025-12-15 10:52:34 |&#xA;...&#xA;...&#xA;| 140        | 157.02         | 2025-12-09 17:58:07 |&#xA;| 141        | 73.88          | 2025-12-13 14:35:53 |&#xA;| 142        | 30.25          | 2025-12-16 13:44:42 |&#xA;| 143        | 143.04         | 2025-12-11 10:08:13 |&#xA;| 144        | 114.22         | 2025-12-15 17:33:37 |&#xA;| 146        | 65.71          | 2025-12-10 15:50:14 |&#xA;| 148        | 101.09         | 2025-12-15 05:37:27 |&#xA;| 149        | 86.31          | 2025-12-13 13:18:14 |&#xA;| 150        | 123.61         | 2025-12-12 10:49:22 |&#xA;+------------+----------------+---------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Now, we need to join both of these on the same product id, to fetch both previous and current timestamp as well as the price.&#xA;&#xA;```sql&#xA;SELECT * FROM previous JOIN latest ON previous.product_id = latest.product_id;&#xA;```&#xA;&#xA;Simple&#xA;&#xA;```sql&#xA;WITH latest AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes                       
      &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;),  previous AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id &#xA;)&#xA;SELECT * FROM previous JOIN latest ON previous.product_id = latest.product_id;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; WITH latest AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes                             &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;),  previous AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id&#xA;)&#xA;SELECT * FROM previous JOIN latest ON previous.product_id = latest.product_id;&#xA;+------------+----------------+---------------------+------------+---------------+---------------------+&#xA;| product_id | previous_price | 
previous_timestamp  | product_id | current_price |  latest_timestamp   |&#xA;+------------+----------------+---------------------+------------+---------------+---------------------+&#xA;| 1          | 106.91         | 2025-12-16 01:51:05 | 1          | 109.25        | 2025-12-16 16:01:53 |&#xA;| 2          | 21.38          | 2025-12-15 16:20:20 | 2          | 17.75         | 2025-12-16 09:28:13 |&#xA;| 3          | 159.65         | 2025-12-14 09:52:09 | 3          | 143.65        | 2025-12-15 03:20:11 |&#xA;| 4          | 105.6          | 2025-12-14 14:09:20 | 4          | 98.51         | 2025-12-16 01:33:41 |&#xA;| 5          | 129.23         | 2025-12-12 04:08:45 | 5          | 124.04        | 2025-12-12 10:11:48 |&#xA;| 6          | 88.97          | 2025-12-15 02:13:04 | 6          | 84.14         | 2025-12-15 11:31:40 |&#xA;| 7          | 127.14         | 2025-12-13 07:25:12 | 7          | 123.09        | 2025-12-16 03:00:51 |&#xA;| 8          | 241.99         | 2025-12-14 19:31:40 | 8          | 221.06        | 2025-12-15 22:33:48 |&#xA;| 9          | 259.56         | 2025-12-12 12:47:13 | 9          | 255.88        | 2025-12-15 20:05:34 |&#xA;| 10         | 64.88          | 2025-12-15 10:52:34 | 10         | 57.99         | 2025-12-15 20:53:45 |&#xA;...&#xA;...&#xA;| 139        | 15.26          | 2025-12-12 03:43:32 | 139        | 16.41         | 2025-12-16 04:46:33 |&#xA;| 140        | 157.02         | 2025-12-09 17:58:07 | 140        | 173.05        | 2025-12-16 21:19:30 |&#xA;| 141        | 73.88          | 2025-12-13 14:35:53 | 141        | 69.97         | 2025-12-15 09:50:36 |&#xA;| 142        | 30.25          | 2025-12-16 13:44:42 | 142        | 35.05         | 2025-12-16 18:39:51 |&#xA;| 143        | 143.04         | 2025-12-11 10:08:13 | 143        | 153.94        | 2025-12-15 07:27:06 |&#xA;| 144        | 114.22         | 2025-12-15 17:33:37 | 144        | 118.21        | 2025-12-16 16:25:16 |&#xA;| 146        | 65.71          | 2025-12-10 15:50:14 | 
146        | 54.73         | 2025-12-13 07:07:19 |&#xA;| 148        | 101.09         | 2025-12-15 05:37:27 | 148        | 107.81        | 2025-12-16 09:30:11 |&#xA;| 149        | 86.31          | 2025-12-13 13:18:14 | 149        | 72.6          | 2025-12-13 16:40:21 |&#xA;| 150        | 123.61         | 2025-12-12 10:49:22 | 150        | 138.66        | 2025-12-13 08:24:43 |&#xA;+------------+----------------+---------------------+------------+---------------+---------------------+&#xA;sqlite&gt; &#xA;&#xA;&#xA;```&#xA;&#xA;Now we are getting somewhere; we just need to find the difference, right?&#xA;&#xA;Yes, but it takes a few more JOINs.&#xA;&#xA;We need the product_name from the `products` table. Almost forgot that table exists, right?&#xA;&#xA;```sql&#xA;SELECT &#xA;    *&#xA;FROM &#xA;    products &#xA;JOIN latest &#xA;    ON products.product_id = latest.product_id &#xA;LEFT JOIN previous &#xA;    ON products.product_id = previous.product_id;&#xA;```&#xA;&#xA;We need both CTEs, so we fetch the product id and join the `latest` and `previous` tables from the CTEs on `product_id`.&#xA;&#xA;&#xA;```sql&#xA;WITH latest AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (&#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes&#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;),  previous AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM 
price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id&#xA;)&#xA;SELECT &#xA;    *&#xA;FROM &#xA;    products &#xA;JOIN latest &#xA;    ON products.product_id = latest.product_id &#xA;LEFT JOIN previous &#xA;    ON products.product_id = previous.product_id;&#xA;```&#xA;&#xA;&#xA;```&#xA;sqlite&gt; WITH latest AS (                                        &#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes                             &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;),  previous AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id&#xA;)&#xA;SELECT * FROM products JOIN latest ON products.product_id = latest.product_id &#xA;   ...&gt; LEFT JOIN previous ON products.product_id = previous.product_id;&#xA;+------------+-----------------------------------+------------+---------------+---------------------+------------+----------------+---------------------+&#xA;| product_id |           product_name            | product_id | current_price |  latest_timestamp   | product_id | previous_price | 
previous_timestamp  |&#xA;+------------+-----------------------------------+------------+---------------+---------------------+------------+----------------+---------------------+&#xA;| 1          | Deluxe Sled                       | 1          | 109.25        | 2025-12-16 16:01:53 | 1          | 106.91         | 2025-12-16 01:51:05 |&#xA;| 2          | Holiday Trail Mix Trio            | 2          | 17.75         | 2025-12-16 09:28:13 | 2          | 21.38          | 2025-12-15 16:20:20 |&#xA;| 3          | Premium Cinnamon Roasted Almonds  | 3          | 143.65        | 2025-12-15 03:20:11 | 3          | 159.65         | 2025-12-14 09:52:09 |&#xA;| 4          | Deluxe Wrapping Paper             | 4          | 98.51         | 2025-12-16 01:33:41 | 4          | 105.6          | 2025-12-14 14:09:20 |&#xA;| 5          | Deluxe Roasted Cashews            | 5          | 124.04        | 2025-12-12 10:11:48 | 5          | 129.23         | 2025-12-12 04:08:45 |&#xA;| 6          | Festive Cookware Set              | 6          | 84.14         | 2025-12-15 11:31:40 | 6          | 88.97          | 2025-12-15 02:13:04 |&#xA;| 7          | Deluxe Mug                        | 7          | 123.09        | 2025-12-16 03:00:51 | 7          | 127.14         | 2025-12-13 07:25:12 |&#xA;| 8          | Premium Sled                      | 8          | 221.06        | 2025-12-15 22:33:48 | 8          | 241.99         | 2025-12-14 19:31:40 |&#xA;| 9          | Essential Sled                    | 9          | 255.88        | 2025-12-15 20:05:34 | 9          | 259.56         | 2025-12-12 12:47:13 |&#xA;| 10         | Family Snow Boots                 | 10         | 57.99         | 2025-12-15 20:53:45 | 10         | 64.88          | 2025-12-15 10:52:34 |&#xA;...&#xA;...&#xA;| 140        | Classic Mug                       | 140        | 173.05        | 2025-12-16 21:19:30 | 140        | 157.02         | 2025-12-09 17:58:07 |&#xA;| 141        | Family Fruit Assortment           | 141        
| 69.97         | 2025-12-15 09:50:36 | 141        | 73.88          | 2025-12-13 14:35:53 |&#xA;| 142        | Classic Ornament                  | 142        | 35.05         | 2025-12-16 18:39:51 | 142        | 30.25          | 2025-12-16 13:44:42 |&#xA;| 143        | Essential Ornament                | 143        | 153.94        | 2025-12-15 07:27:06 | 143        | 143.04         | 2025-12-11 10:08:13 |&#xA;| 144        | Premium Trail Mix Trio            | 144        | 118.21        | 2025-12-16 16:25:16 | 144        | 114.22         | 2025-12-15 17:33:37 |&#xA;| 146        | Premium Book Collection           | 146        | 54.73         | 2025-12-13 07:07:19 | 146        | 65.71          | 2025-12-10 15:50:14 |&#xA;| 148        | Cozy Trail Mix Trio               | 148        | 107.81        | 2025-12-16 09:30:11 | 148        | 101.09         | 2025-12-15 05:37:27 |&#xA;| 149        | Family Cheddar Popcorn            | 149        | 72.6          | 2025-12-13 16:40:21 | 149        | 86.31          | 2025-12-13 13:18:14 |&#xA;| 150        | Holiday Headphones                | 150        | 138.66        | 2025-12-13 08:24:43 | 150        | 123.61         | 2025-12-12 10:49:22 |&#xA;+------------+-----------------------------------+------------+---------------+---------------------+------------+----------------+---------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Now, we just want&#xA;- product_name&#xA;- previous_price&#xA;- current_price&#xA;- difference of current_price and previous price&#xA;&#xA;So,&#xA;&#xA;```sql&#xA;WITH latest AS (                                        &#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY 
product_id&#xA;    ) latest_price_changes                             &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;),  previous AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id&#xA;)&#xA;SELECT &#xA;    products.product_name, &#xA;    latest.current_price, &#xA;    previous.previous_price, &#xA;    (latest.current_price - previous.previous_price) as price_difference&#xA;FROM &#xA;    products &#xA;JOIN latest &#xA;    ON products.product_id = latest.product_id &#xA;LEFT JOIN previous &#xA;    ON products.product_id = previous.product_id;&#xA;```&#xA;&#xA;&#xA;```&#xA;sqlite&gt; WITH latest AS (                                        &#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS current_price,&#xA;           price_changes.effective_timestamp AS latest_timestamp&#xA;    FROM price_changes&#xA;    JOIN (                                                   &#xA;        SELECT product_id, MAX(effective_timestamp) AS max_timestamp&#xA;        FROM price_changes&#xA;        GROUP BY product_id&#xA;    ) latest_price_changes                             &#xA;      ON price_changes.product_id = latest_price_changes.product_id&#xA;     AND price_changes.effective_timestamp = latest_price_changes.max_timestamp&#xA;),  previous AS (&#xA;    SELECT price_changes.product_id,&#xA;           price_changes.price AS previous_price,&#xA;           MAX(price_changes.effective_timestamp) AS previous_timestamp&#xA;    FROM price_changes&#xA;    JOIN latest&#xA;      ON price_changes.product_id = 
latest.product_id&#xA;    WHERE price_changes.effective_timestamp &lt; latest.latest_timestamp&#xA;    GROUP BY price_changes.product_id&#xA;)&#xA;SELECT &#xA;    products.product_name, latest.current_price, previous.previous_price, (latest.current_price - previous.previous_price) as price_difference&#xA;FROM &#xA;    products &#xA;JOIN latest &#xA;    ON products.product_id = latest.product_id &#xA;LEFT JOIN previous &#xA;    ON products.product_id = previous.product_id;&#xA;&#xA;+-----------------------------------+---------------+----------------+-------------------+&#xA;|           product_name            | current_price | previous_price | price_difference  |&#xA;+-----------------------------------+---------------+----------------+-------------------+&#xA;| Deluxe Sled                       | 109.25        | 106.91         | 2.34              |&#xA;| Holiday Trail Mix Trio            | 17.75         | 21.38          | -3.63             |&#xA;| Premium Cinnamon Roasted Almonds  | 143.65        | 159.65         | -16.0             |&#xA;| Deluxe Wrapping Paper             | 98.51         | 105.6          | -7.08999999999999 |&#xA;| Deluxe Roasted Cashews            | 124.04        | 129.23         | -5.18999999999998 |&#xA;| Festive Cookware Set              | 84.14         | 88.97          | -4.83             |&#xA;| Deluxe Mug                        | 123.09        | 127.14         | -4.05             |&#xA;| Premium Sled                      | 221.06        | 241.99         | -20.93            |&#xA;| Essential Sled                    | 255.88        | 259.56         | -3.68000000000001 |&#xA;...&#xA;...&#xA;| Classic Ornament                  | 35.05         | 30.25          | 4.8               |&#xA;| Essential Ornament                | 153.94        | 143.04         | 10.9              |&#xA;| Premium Trail Mix Trio            | 118.21        | 114.22         | 3.98999999999999  |&#xA;| Premium Book Collection           | 54.73         | 65.71          | 
-10.98            |&#xA;| Cozy Trail Mix Trio               | 107.81        | 101.09         | 6.72              |&#xA;| Family Cheddar Popcorn            | 72.6          | 86.31          | -13.71            |&#xA;| Holiday Headphones                | 138.66        | 123.61         | 15.05             |&#xA;+-----------------------------------+---------------+----------------+-------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;&#xA;And that&#39;s it!&#xA;&#xA;CTEs and a nasty number of JOINs.&#xA;&#xA;Can we do it some other way?&#xA;&#xA;Of course we can, and we will.&#xA;&#xA;&#xA;### ROW NUMBER - Window Function&#xA;&#xA;We can also use window functions here: since we are doing things per product, we can leverage [ROW_NUMBER](https://sqlite.org/windowfunctions.html#:~:text=in%20window%20functions%3A-,row_number(),-The%20number%20of) in our case.&#xA;&#xA;&gt; ROW_NUMBER: The number of the row within the current partition. Rows are numbered starting from 1 in the order defined by the ORDER BY clause in the window definition, or in arbitrary order otherwise.&#xA;&#xA;So, we can partition by `product_id` in the `price_changes` table, order by the timestamp (latest first, i.e. 
descending) and grab the first two prices.&#xA;&#xA;```sql&#xA;WITH ranked_prices AS (&#xA;    SELECT price_changes.*,&#xA;           ROW_NUMBER() OVER (&#xA;               PARTITION BY price_changes.product_id&#xA;               ORDER BY price_changes.effective_timestamp DESC&#xA;           ) AS row_number&#xA;    FROM price_changes&#xA;)&#xA;SELECT&#xA;    *&#xA;FROM ranked_prices;&#xA;```&#xA;&#xA;Here, we partitioned the prices by product, ordered each partition from latest to oldest timestamp (descending), and added a `row_number` column that we will use next to pick the `previous` and the `current` price and time.&#xA;&#xA;```&#xA;sqlite&gt; WITH ranked_prices AS (&#xA;    SELECT price_changes.*,&#xA;           ROW_NUMBER() OVER (&#xA;               PARTITION BY price_changes.product_id&#xA;               ORDER BY price_changes.effective_timestamp DESC&#xA;           ) AS row_number&#xA;    FROM price_changes&#xA;)&#xA;SELECT&#xA;    *&#xA;FROM ranked_prices;&#xA;+------+------------+--------+---------------------+------------+&#xA;|  id  | product_id | price  | effective_timestamp | row_number |&#xA;+------+------------+--------+---------------------+------------+&#xA;| 15   | 1          | 109.25 | 2025-12-16 16:01:53 | 1          |&#xA;| 14   | 1          | 106.91 | 2025-12-16 01:51:05 | 2          |&#xA;| 13   | 1          | 100.18 | 2025-12-14 15:58:03 | 3          |&#xA;| 12   | 1          | 99.03  | 2025-12-12 02:58:14 | 4          |&#xA;| 11   | 1          | 88.2   | 2025-12-12 02:21:09 | 5          |&#xA;| 10   | 1          | 73.9   | 2025-12-10 14:33:43 | 6          |&#xA;| 9    | 1          | 80.24  | 2025-12-08 16:11:11 | 7          |&#xA;| 8    | 1          | 78.88  | 2025-12-08 06:45:39 | 8          |&#xA;| 7    | 1          | 80.88  | 2025-12-07 10:43:54 | 9          |&#xA;| 6    | 1          | 88.49  | 2025-12-06 19:02:40 | 10         |&#xA;| 5    | 1          | 98.57  | 2025-12-04 04:14:31 | 11         |&#xA;| 4    | 1          | 119.12 | 2025-12-03 
10:14:35 | 12         |&#xA;| 3    | 1          | 126.78 | 2025-12-02 18:40:38 | 13         |&#xA;| 2    | 1          | 148.63 | 2025-12-02 18:15:33 | 14         |&#xA;| 1    | 1          | 148.28 | 2025-12-01 05:25:35 | 15         |&#xA;...&#xA;...&#xA;| 1260 | 149        | 167.1  | 2025-12-01 06:36:43 | 15         |&#xA;| 1288 | 150        | 138.66 | 2025-12-13 08:24:43 | 1          |&#xA;| 1287 | 150        | 123.61 | 2025-12-12 10:49:22 | 2          |&#xA;| 1286 | 150        | 141.8  | 2025-12-10 07:06:06 | 3          |&#xA;| 1285 | 150        | 122.16 | 2025-12-09 20:01:54 | 4          |&#xA;| 1284 | 150        | 122.06 | 2025-12-06 22:27:41 | 5          |&#xA;| 1283 | 150        | 128.6  | 2025-12-06 13:03:18 | 6          |&#xA;| 1282 | 150        | 154.72 | 2025-12-05 08:15:08 | 7          |&#xA;| 1281 | 150        | 170.3  | 2025-12-04 15:25:50 | 8          |&#xA;| 1280 | 150        | 156.51 | 2025-12-03 19:11:12 | 9          |&#xA;| 1279 | 150        | 161.93 | 2025-12-02 02:47:10 | 10         |&#xA;| 1278 | 150        | 174.36 | 2025-12-02 01:39:14 | 11         |&#xA;| 1277 | 150        | 180.17 | 2025-12-01 20:36:02 | 12         |&#xA;| 1276 | 150        | 164.35 | 2025-12-01 07:09:29 | 13         |&#xA;| 1275 | 150        | 141    | 2025-12-01 01:29:46 | 14         |&#xA;+------+------------+--------+---------------------+------------+&#xA;&#xA;```&#xA;&#xA;&#xA;&#xA;```sql&#xA;WITH ranked_prices AS (&#xA;    SELECT price_changes.*,&#xA;           ROW_NUMBER() OVER (&#xA;               PARTITION BY price_changes.product_id&#xA;               ORDER BY price_changes.effective_timestamp DESC&#xA;           ) AS row_number&#xA;    FROM price_changes&#xA;)&#xA;SELECT&#xA;    products.product_name,&#xA;    current_price.price AS current_price,&#xA;    previous_price.price AS previous_price,&#xA;    current_price.price - previous_price.price AS price_difference&#xA;FROM products&#xA;LEFT JOIN ranked_prices AS current_price&#xA;       ON products.product_id = 
current_price.product_id&#xA;      AND current_price.row_number = 1&#xA;LEFT JOIN ranked_prices AS previous_price&#xA;       ON products.product_id = previous_price.product_id&#xA;      AND previous_price.row_number = 2;&#xA;```&#xA;This is pretty simple; we just get:&#xA;- the `current_price` as `row_number = 1` from the `ranked_prices` CTE&#xA;- the `previous_price` as `row_number = 2` from the `ranked_prices` CTE&#xA;&#xA;That gives us everything we need:&#xA;- `products.product_name`&#xA;- `current_price.price AS current_price`&#xA;- `previous_price.price AS previous_price`&#xA;- `current_price.price - previous_price.price AS price_difference`&#xA;&#xA;So, this is how `ROW_NUMBER` can be used here.&#xA;&#xA;&#xA;### With LEAD LAG - Window Functions&#xA;&#xA;We can also use the [LEAD](https://sqlite.org/windowfunctions.html#:~:text=does%20not%20exist.-,lead(expr),-lead(expr%2C%20offset) and [LAG](https://sqlite.org/windowfunctions.html#:~:text=a%20part%20of.-,lag(expr),-lag(expr%2C%20offset) window functions here. `LAG` fits especially well, since the challenge is to get the price at the second-latest timestamp.&#xA;&#xA;We will partition the `price_changes` table on the `product_id` and order it by the `effective_timestamp`. This gives us, for each row, the price from the row just before it, so the `previous_price` of the first row in each partition is NULL (sorted in ascending order of timestamp, there is no earlier row for the first one).&#xA;&#xA;```sql&#xA;WITH lagged_prices AS (&#xA;    SELECT&#xA;        product_id,&#xA;        price AS current_price,&#xA;        LAG(price) OVER (&#xA;            PARTITION BY product_id&#xA;            ORDER BY effective_timestamp&#xA;        ) AS previous_price,&#xA;        effective_timestamp&#xA;    FROM price_changes&#xA;)&#xA;SELECT * FROM lagged_prices;&#xA;```&#xA;&#xA;Here, we simply select the data we need: `product_id`, `price` as the `current_price`, and `LAG(price)` as the `previous_price`. 
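&#xA;As a quick aside (my addition, not part of the original solution): if the NULL in the first row of each partition is unwanted, SQLite&#39;s `IFNULL` can fall back to the current price, so the first price difference would come out as zero. A minimal sketch:&#xA;&#xA;```sql&#xA;-- sketch: same LAG window, but a missing previous price&#xA;-- falls back to the current price via IFNULL&#xA;SELECT&#xA;    product_id,&#xA;    price AS current_price,&#xA;    IFNULL(LAG(price) OVER (&#xA;        PARTITION BY product_id&#xA;        ORDER BY effective_timestamp&#xA;    ), price) AS previous_price&#xA;FROM price_changes;&#xA;```&#xA;&#xA;The output of the original `lagged_prices` query looks like this: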
&#xA;&#xA;```&#xA;sqlite&gt; WITH lagged_prices AS (&#xA;    SELECT&#xA;        product_id,&#xA;        price AS current_price,&#xA;        LAG(price) OVER (&#xA;            PARTITION BY product_id&#xA;            ORDER BY effective_timestamp&#xA;        ) AS previous_price,&#xA;        effective_timestamp&#xA;    FROM price_changes&#xA;)&#xA;SELECT * FROM lagged_prices;&#xA;+------------+---------------+----------------+---------------------+&#xA;| product_id | current_price | previous_price | effective_timestamp |&#xA;+------------+---------------+----------------+---------------------+&#xA;| 1          | 148.28        |                | 2025-12-01 05:25:35 |&#xA;| 1          | 148.63        | 148.28         | 2025-12-02 18:15:33 |&#xA;| 1          | 126.78        | 148.63         | 2025-12-02 18:40:38 |&#xA;| 1          | 119.12        | 126.78         | 2025-12-03 10:14:35 |&#xA;| 1          | 98.57         | 119.12         | 2025-12-04 04:14:31 |&#xA;| 1          | 88.49         | 98.57          | 2025-12-06 19:02:40 |&#xA;| 1          | 80.88         | 88.49          | 2025-12-07 10:43:54 |&#xA;| 1          | 78.88         | 80.88          | 2025-12-08 06:45:39 |&#xA;| 1          | 80.24         | 78.88          | 2025-12-08 16:11:11 |&#xA;| 1          | 73.9          | 80.24          | 2025-12-10 14:33:43 |&#xA;| 1          | 88.2          | 73.9           | 2025-12-12 02:21:09 |&#xA;| 1          | 99.03         | 88.2           | 2025-12-12 02:58:14 |&#xA;| 1          | 100.18        | 99.03          | 2025-12-14 15:58:03 |&#xA;| 1          | 106.91        | 100.18         | 2025-12-16 01:51:05 |&#xA;| 1          | 109.25        | 106.91         | 2025-12-16 16:01:53 |&#xA;| 2          | 29.54         |                | 2025-12-03 14:21:10 |&#xA;| 2          | 34.33         | 29.54          | 2025-12-03 19:14:31 |&#xA;| 2          | 39.08         | 34.33          | 2025-12-04 06:13:48 |&#xA;| 2          | 32.71         | 39.08          | 2025-12-04 18:33:17 
|&#xA;| 2          | 31.71         | 32.71          | 2025-12-05 22:36:14 |&#xA;| 2          | 28.88         | 31.71          | 2025-12-06 02:42:02 |&#xA;...&#xA;...&#xA;| 149        | 101.03        | 98.4           | 2025-12-10 00:37:46 |&#xA;| 149        | 95.68         | 101.03         | 2025-12-13 01:10:53 |&#xA;| 149        | 86.31         | 95.68          | 2025-12-13 13:18:14 |&#xA;| 149        | 72.6          | 86.31          | 2025-12-13 16:40:21 |&#xA;| 150        | 141           |                | 2025-12-01 01:29:46 |&#xA;| 150        | 164.35        | 141            | 2025-12-01 07:09:29 |&#xA;| 150        | 180.17        | 164.35         | 2025-12-01 20:36:02 |&#xA;| 150        | 174.36        | 180.17         | 2025-12-02 01:39:14 |&#xA;| 150        | 161.93        | 174.36         | 2025-12-02 02:47:10 |&#xA;| 150        | 156.51        | 161.93         | 2025-12-03 19:11:12 |&#xA;| 150        | 170.3         | 156.51         | 2025-12-04 15:25:50 |&#xA;| 150        | 154.72        | 170.3          | 2025-12-05 08:15:08 |&#xA;| 150        | 128.6         | 154.72         | 2025-12-06 13:03:18 |&#xA;| 150        | 122.06        | 128.6          | 2025-12-06 22:27:41 |&#xA;| 150        | 122.16        | 122.06         | 2025-12-09 20:01:54 |&#xA;| 150        | 141.8         | 122.16         | 2025-12-10 07:06:06 |&#xA;| 150        | 123.61        | 141.8          | 2025-12-12 10:49:22 |&#xA;| 150        | 138.66        | 123.61         | 2025-12-13 08:24:43 |&#xA;+------------+---------------+----------------+---------------------+&#xA;&#xA;```&#xA;&#xA;&#xA;Then we can join. 
The join itself is straightforward; the filter condition is what matters.&#xA;&#xA;- we keep only the row with the latest `effective_timestamp` for each product, since that row carries the current price, and its `previous_price` from the `lagged_prices` CTE is the price at the second-latest timestamp.&#xA;&#xA;&#xA;```sql&#xA;SELECT&#xA;    products.product_name,&#xA;    lagged_prices.current_price,&#xA;    lagged_prices.previous_price,&#xA;    lagged_prices.current_price - lagged_prices.previous_price AS price_difference&#xA;FROM products&#xA;JOIN lagged_prices&#xA;  ON products.product_id = lagged_prices.product_id&#xA;WHERE lagged_prices.effective_timestamp = (&#xA;    SELECT MAX(effective_timestamp)&#xA;    FROM price_changes&#xA;    WHERE product_id = products.product_id&#xA;);&#xA;&#xA;```&#xA;&#xA;&#xA;```sql&#xA;WITH lagged_prices AS (&#xA;    SELECT&#xA;        product_id,&#xA;        price AS current_price,&#xA;        LAG(price) OVER (&#xA;            PARTITION BY product_id&#xA;            ORDER BY effective_timestamp&#xA;        ) AS previous_price,&#xA;        effective_timestamp&#xA;    FROM price_changes&#xA;)&#xA;SELECT&#xA;    products.product_name,&#xA;    lagged_prices.current_price,&#xA;    lagged_prices.previous_price,&#xA;    lagged_prices.current_price - lagged_prices.previous_price AS price_difference&#xA;FROM products&#xA;JOIN lagged_prices&#xA;  ON products.product_id = lagged_prices.product_id&#xA;WHERE lagged_prices.effective_timestamp = (&#xA;    SELECT MAX(effective_timestamp)&#xA;    FROM price_changes&#xA;    WHERE product_id = products.product_id&#xA;);&#xA;&#xA;```&#xA;This yields the same result.&#xA;&#xA;These queries are all fairly big; are there any smaller ones?&#xA;&#xA;Not exactly small, but shorter.&#xA;&#xA;### LIMIT OFFSET and Subqueries&#xA;&#xA;Well, here&#39;s one that is a little shorter, but quite dirty.&#xA;```sql&#xA;SELECT *,&#xA;       current_price - previous_price AS price_difference&#xA;FROM (&#xA;    SELECT&#xA;        products.product_name,&#xA;        (SELECT price&#xA;         FROM price_changes&#xA;         WHERE 
price_changes.product_id = products.product_id&#xA;         ORDER BY effective_timestamp DESC&#xA;         LIMIT 1) AS current_price,&#xA;        (SELECT price&#xA;         FROM price_changes&#xA;         WHERE price_changes.product_id = products.product_id&#xA;         ORDER BY effective_timestamp DESC&#xA;         LIMIT 1 OFFSET 1) AS previous_price&#xA;    FROM products&#xA;);&#xA;```&#xA;&#xA;This grabs the price at the latest timestamp with `LIMIT 1`, the price at the second-latest timestamp with `LIMIT 1 OFFSET 1`, and wraps both in a derived table to compute the difference.&#xA;&#xA;Pretty slick if you ask me.&#xA;&#xA;But hey, I am done with this one!&#xA;&#xA;It was a great problem.&#xA;&#xA;Getting the record just behind another record is quite relatable, and challenging enough to explore window functions and more.&#xA;&#xA;That&#39;s it from day 15, the final day of Advent of SQL!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 7: Polar Express Mixin</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-7</link>
      <description>Advent of SQL, Day 7 - Polar Express There were a few things, I had to dig up for converting the JSON in the statements into strings for SQLite, we can&#39;t really</description>
      <pubDate>Mon, 22 Dec 2025 00:00:00 UTC</pubDate>
<content>## Advent of SQL, Day 7 - Polar Express&#xA;&#xA;There were a few things I had to dig up for converting the `ARRAY[]` literals in the statements into JSON strings for SQLite; we can&#39;t really use a list of strings in SQLite.&#xA;&#xA;Here&#39;s the command to convert each array of strings into a single JSON string literal.&#xA;&#xA;```&#xA;sed &#34;s/ARRAY\[&#39;/&#39;\[\&#34;/g; s/&#39;,&#39;/\&#34;,\&#34;/g; s/&#39;]/\&#34;]&#39;/g&#34; day7-inserts.sql &gt; day7-inserts-sqlite.sql&#xA;```&#xA;&#xA;OK, once that&#39;s done, the file can be safely read into a SQLite database.&#xA;&#xA;```&#xA;$ sqlite3&#xA;SQLite version 3.45.1 2024-01-30 16:01:20&#xA;Enter &#34;.help&#34; for usage hints.&#xA;Connected to a transient in-memory database.&#xA;Use &#34;.open FILENAME&#34; to reopen on a persistent database.&#xA;sqlite&gt; .read day7-inserts-sqlite.sql&#xA;sqlite&gt; .schema&#xA;CREATE TABLE passengers (&#xA;    passenger_id INT PRIMARY KEY,&#xA;    passenger_name TEXT,&#xA;    favorite_mixins TEXT[],&#xA;    car_id INT&#xA;);&#xA;CREATE TABLE cocoa_cars (&#xA;    car_id INT PRIMARY KEY,&#xA;    available_mixins TEXT[],&#xA;    total_stock INT&#xA;);&#xA;sqlite&gt; &#xA;sqlite&gt; .mode table&#xA;sqlite&gt; select * from passengers limit 20;&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| passenger_id | passenger_name |                       favorite_mixins                        | car_id |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 1            | Ava Johnson    | [&#34;vanilla foam&#34;]                                             | 2      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 2            | Mateo Cruz     | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | 2      
|&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 3            | Nia Grant      | [&#34;shaved chocolate&#34;]                                         | 5      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 4            | Hiro Tanaka    | [&#34;shaved chocolate&#34;]                                         | 2      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 5            | Layla Brooks   | [&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;,&#34;cinnamon&#34; | 3      |&#xA;|              |                | ]                                                            |        |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 6            | Ravi Patel     | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | 5      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 7            | Sofia Kim      | [&#34;cinnamon&#34;]                                                 | 9      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 8            | Jonah Wolfe    | [&#34;cinnamon&#34;,&#34;dark chocolate&#34;]                                | 7      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 9            | Elena Morales  | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]     | 6      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 10           | Diego Ramos    | [&#34;shaved chocolate&#34;]                                         | 1      
|&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 11           | Zara Sheikh    | [&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]                  | 4      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 12           | Caleb Osei     | [&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;white chocolate&#34;]      | 8      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 13           | Mila Novak     | [&#34;crispy rice&#34;,&#34;cinnamon&#34;]                                   | 4      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 14           | Lucas Ford     | [&#34;vanilla foam&#34;,&#34;white chocolate&#34;,&#34;cinnamon&#34;]                | 4      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 15           | Yara Haddad    | [&#34;white chocolate&#34;,&#34;dark chocolate&#34;]                         | 2      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 16           | Omar Qureshi   | [&#34;marshmallow&#34;]                                              | 3      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 17           | Keiko Ito      | [&#34;vanilla foam&#34;,&#34;marshmallow&#34;]                               | 7      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 18           | Tariq Hassan   | [&#34;dark chocolate&#34;,&#34;crispy rice&#34;,&#34;white chocolate&#34;,&#34;peppermin | 2      |&#xA;|              |                | t&#34;]                                     
                     |        |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 19           | Mira Zhao      | [&#34;caramel drizzle&#34;,&#34;marshmallow&#34;,&#34;cinnamon&#34;]                 | 7      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 20           | Bianca Pereira | [&#34;dark chocolate&#34;,&#34;peppermint&#34;]                              | 5      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;sqlite&gt; &#xA;sqlite&gt; select * from cocoa_cars;&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| car_id |                       available_mixins                       | total_stock |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 2      | [&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]                 | 359         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         |&#xA;|        | ate&#34;]                                                        |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 4      | [&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]                       | 338         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 8      | [&#34;vanilla foam&#34;,&#34;marshmallow&#34;]                               | 263         
|&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 1      | [&#34;peppermint&#34;,&#34;crispy rice&#34;]                                 | 205         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 6      | [&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;crispy rice&#34;,&#34;cinnamon | 161         |&#xA;|        | &#34;,&#34;peppermint&#34;]                                              |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 7      | [&#34;caramel drizzle&#34;,&#34;crispy rice&#34;,&#34;marshmallow&#34;,&#34;vanilla foam | 132         |&#xA;|        | &#34;,&#34;cinnamon&#34;]                                                |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 3      | [&#34;vanilla foam&#34;,&#34;peppermint&#34;]                                | 95          |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Here&#39;s your full SQL file:&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS passengers;&#xA;DROP TABLE IF EXISTS cocoa_cars;&#xA;&#xA;CREATE TABLE passengers (&#xA;    passenger_id INT PRIMARY KEY,&#xA;    passenger_name TEXT,&#xA;    favorite_mixins TEXT[],&#xA;    car_id INT&#xA;);&#xA;&#xA;CREATE TABLE cocoa_cars (&#xA;    car_id INT PRIMARY KEY,&#xA;    available_mixins TEXT[],&#xA;    total_stock INT&#xA;);&#xA;&#xA;INSERT INTO passengers (passenger_id, passenger_name, favorite_mixins, car_id) VALUES&#xA;    (1, &#39;Ava Johnson&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 2),&#xA;    (2, &#39;Mateo Cruz&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]&#39;, 2),&#xA;    (3, &#39;Nia Grant&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 5),&#xA;    (4, &#39;Hiro Tanaka&#39;, 
&#39;[&#34;shaved chocolate&#34;]&#39;, 2),&#xA;    (5, &#39;Layla Brooks&#39;, &#39;[&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;,&#34;cinnamon&#34;]&#39;, 3),&#xA;    (6, &#39;Ravi Patel&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]&#39;, 5),&#xA;    (7, &#39;Sofia Kim&#39;, &#39;[&#34;cinnamon&#34;]&#39;, 9),&#xA;    (8, &#39;Jonah Wolfe&#39;, &#39;[&#34;cinnamon&#34;,&#34;dark chocolate&#34;]&#39;, 7),&#xA;    (9, &#39;Elena Morales&#39;, &#39;[&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]&#39;, 6),&#xA;    (10, &#39;Diego Ramos&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 1),&#xA;    (11, &#39;Zara Sheikh&#39;, &#39;[&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]&#39;, 4),&#xA;    (12, &#39;Caleb Osei&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;white chocolate&#34;]&#39;, 8),&#xA;    (13, &#39;Mila Novak&#39;, &#39;[&#34;crispy rice&#34;,&#34;cinnamon&#34;]&#39;, 4),&#xA;    (14, &#39;Lucas Ford&#39;, &#39;[&#34;vanilla foam&#34;,&#34;white chocolate&#34;,&#34;cinnamon&#34;]&#39;, 4),&#xA;    (15, &#39;Yara Haddad&#39;, &#39;[&#34;white chocolate&#34;,&#34;dark chocolate&#34;]&#39;, 2),&#xA;    (16, &#39;Omar Qureshi&#39;, &#39;[&#34;marshmallow&#34;]&#39;, 3),&#xA;    (17, &#39;Keiko Ito&#39;, &#39;[&#34;vanilla foam&#34;,&#34;marshmallow&#34;]&#39;, 7),&#xA;    (18, &#39;Tariq Hassan&#39;, &#39;[&#34;dark chocolate&#34;,&#34;crispy rice&#34;,&#34;white chocolate&#34;,&#34;peppermint&#34;]&#39;, 2),&#xA;    (19, &#39;Mira Zhao&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;marshmallow&#34;,&#34;cinnamon&#34;]&#39;, 7),&#xA;    (20, &#39;Bianca Pereira&#39;, &#39;[&#34;dark chocolate&#34;,&#34;peppermint&#34;]&#39;, 5),&#xA;    (21, &#39;Eva Schmidt&#39;, &#39;[&#34;white chocolate&#34;,&#34;marshmallow&#34;]&#39;, 5),&#xA;    (22, &#39;Rafael Silva&#39;, &#39;[&#34;cinnamon&#34;,&#34;caramel drizzle&#34;]&#39;, 
3),&#xA;    (23, &#39;Nolan Murphy&#39;, &#39;[&#34;caramel drizzle&#34;]&#39;, 8),&#xA;    (24, &#39;Sara Johansson&#39;, &#39;[&#34;marshmallow&#34;]&#39;, 6),&#xA;    (25, &#39;Ingrid Nilsen&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;peppermint&#34;,&#34;marshmallow&#34;]&#39;, 2),&#xA;    (26, &#39;Arjun Kapoor&#39;, &#39;[&#34;dark chocolate&#34;]&#39;, 2),&#xA;    (27, &#39;Nova Adams&#39;, &#39;[&#34;cinnamon&#34;,&#34;dark chocolate&#34;]&#39;, 9),&#xA;    (28, &#39;Felix Schneider&#39;, &#39;[&#34;crispy rice&#34;,&#34;vanilla foam&#34;]&#39;, 4),&#xA;    (29, &#39;Tim Cook&#39;, &#39;[&#34;crispy rice&#34;,&#34;vanilla foam&#34;]&#39;, 6),&#xA;    (30, &#39;Sophia Rossi&#39;, &#39;[&#34;white chocolate&#34;,&#34;dark chocolate&#34;,&#34;peppermint&#34;,&#34;marshmallow&#34;]&#39;, 4),&#xA;    (31, &#39;Liam OConnor&#39;, &#39;[&#34;caramel drizzle&#34;]&#39;, 1),&#xA;    (32, &#39;Olivia Dubois&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;peppermint&#34;]&#39;, 2),&#xA;    (33, &#39;Emma Svensson&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 2),&#xA;    (34, &#39;Noah Fischer&#39;, &#39;[&#34;caramel drizzle&#34;]&#39;, 2),&#xA;    (35, &#39;William Becker&#39;, &#39;[&#34;crispy rice&#34;,&#34;dark chocolate&#34;]&#39;, 4),&#xA;    (36, &#39;Isabella Laurent&#39;, &#39;[&#34;dark chocolate&#34;,&#34;shaved chocolate&#34;]&#39;, 8),&#xA;    (37, &#39;James Kim&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;marshmallow&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;]&#39;, 7),&#xA;    (38, &#39;Mia Chen&#39;, &#39;[&#34;vanilla foam&#34;,&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]&#39;, 1),&#xA;    (39, &#39;Benjamin Patel&#39;, &#39;[&#34;peppermint&#34;]&#39;, 7),&#xA;    (40, &#39;Charlotte Singh&#39;, &#39;[&#34;marshmallow&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]&#39;, 4),&#xA;    (41, &#39;Daniel Murphy&#39;, &#39;[&#34;cinnamon&#34;,&#34;vanilla foam&#34;,&#34;marshmallow&#34;,&#34;white chocolate&#34;]&#39;, 8),&#xA;    
(42, &#39;Zoe Wilson&#39;, &#39;[&#34;marshmallow&#34;,&#34;dark chocolate&#34;]&#39;, 9),&#xA;    (43, &#39;Robert Smith&#39;, &#39;[&#34;peppermint&#34;]&#39;, 9),&#xA;    (44, &#39;Emily Johnson&#39;, &#39;[&#34;marshmallow&#34;]&#39;, 4),&#xA;    (45, &#39;David Brown&#39;, &#39;[&#34;vanilla foam&#34;,&#34;dark chocolate&#34;]&#39;, 8),&#xA;    (46, &#39;Sarah Davis&#39;, &#39;[&#34;vanilla foam&#34;,&#34;peppermint&#34;]&#39;, 3),&#xA;    (47, &#39;James Simon&#39;, &#39;[&#34;peppermint&#34;,&#34;vanilla foam&#34;,&#34;cinnamon&#34;,&#34;shaved chocolate&#34;]&#39;, 5),&#xA;    (48, &#39;Linda Lee&#39;, &#39;[&#34;white chocolate&#34;,&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;vanilla foam&#34;]&#39;, 5),&#xA;    (49, &#39;Carlos Mendez&#39;, &#39;[&#34;peppermint&#34;,&#34;white chocolate&#34;]&#39;, 6),&#xA;    (50, &#39;Fatima Noor&#39;, &#39;[&#34;peppermint&#34;]&#39;, 8),&#xA;    (51, &#39;Youssef El-Sayed&#39;, &#39;[&#34;peppermint&#34;,&#34;marshmallow&#34;]&#39;, 3),&#xA;    (52, &#39;Ian Landsman&#39;, &#39;[&#34;marshmallow&#34;]&#39;, 4),&#xA;    (53, &#39;Nolan Young&#39;, &#39;[&#34;marshmallow&#34;,&#34;shaved chocolate&#34;,&#34;crispy rice&#34;,&#34;vanilla foam&#34;]&#39;, 1),&#xA;    (54, &#39;Ava Martinez&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 9),&#xA;    (55, &#39;William Chen&#39;, &#39;[&#34;crispy rice&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]&#39;, 6),&#xA;    (56, &#39;Isabella Rodriguez&#39;, &#39;[&#34;crispy rice&#34;,&#34;vanilla foam&#34;]&#39;, 3),&#xA;    (57, &#39;Zachary Collins&#39;, &#39;[&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]&#39;, 2),&#xA;    (58, &#39;Audrey Edwards&#39;, &#39;[&#34;dark chocolate&#34;]&#39;, 2),&#xA;    (59, &#39;Jason Stewart&#39;, &#39;[&#34;white chocolate&#34;]&#39;, 4),&#xA;    (60, &#39;Lucy Morris&#39;, &#39;[&#34;cinnamon&#34;,&#34;caramel drizzle&#34;,&#34;peppermint&#34;]&#39;, 4),&#xA;    (61, &#39;Cameron Rogers&#39;, 
&#39;[&#34;crispy rice&#34;,&#34;cinnamon&#34;,&#34;shaved chocolate&#34;]&#39;, 9),&#xA;    (62, &#39;Aria Blackwood&#39;, &#39;[&#34;white chocolate&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;]&#39;, 9),&#xA;    (63, &#39;Felix Whitmore&#39;, &#39;[&#34;marshmallow&#34;,&#34;cinnamon&#34;,&#34;dark chocolate&#34;]&#39;, 2),&#xA;    (64, &#39;Luna Hartley&#39;, &#39;[&#34;white chocolate&#34;]&#39;, 3),&#xA;    (65, &#39;Jasper Thorne&#39;, &#39;[&#34;crispy rice&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]&#39;, 6),&#xA;    (66, &#39;Nora Calloway&#39;, &#39;[&#34;crispy rice&#34;,&#34;dark chocolate&#34;]&#39;, 5),&#xA;    (67, &#39;Silas Merrick&#39;, &#39;[&#34;marshmallow&#34;]&#39;, 7),&#xA;    (68, &#39;Iris Pembroke&#39;, &#39;[&#34;peppermint&#34;,&#34;white chocolate&#34;,&#34;cinnamon&#34;]&#39;, 3),&#xA;    (69, &#39;Milo Ashford&#39;, &#39;[&#34;cinnamon&#34;,&#34;dark chocolate&#34;,&#34;crispy rice&#34;]&#39;, 7),&#xA;    (70, &#39;Clara Westbrook&#39;, &#39;[&#34;marshmallow&#34;]&#39;, 2),&#xA;    (71, &#39;Owen Fairchild&#39;, &#39;[&#34;white chocolate&#34;,&#34;peppermint&#34;]&#39;, 6),&#xA;    (72, &#39;Ruby Hawthorne&#39;, &#39;[&#34;vanilla foam&#34;,&#34;cinnamon&#34;]&#39;, 1),&#xA;    (73, &#39;Finn Lockhart&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;peppermint&#34;,&#34;cinnamon&#34;]&#39;, 4),&#xA;    (74, &#39;Violet Sterling&#39;, &#39;[&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]&#39;, 9),&#xA;    (75, &#39;August Blackwell&#39;, &#39;[&#34;cinnamon&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;marshmallow&#34;]&#39;, 3),&#xA;    (76, &#39;Hazel Kincaid&#39;, &#39;[&#34;peppermint&#34;,&#34;cinnamon&#34;,&#34;caramel drizzle&#34;,&#34;dark chocolate&#34;]&#39;, 7),&#xA;    (77, &#39;Leo Greyson&#39;, &#39;[&#34;crispy rice&#34;,&#34;cinnamon&#34;]&#39;, 2),&#xA;    (78, &#39;Stella Beaumont&#39;, &#39;[&#34;peppermint&#34;,&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;vanilla 
foam&#34;]&#39;, 8),&#xA;    (79, &#39;Miles Brennan&#39;, &#39;[&#34;crispy rice&#34;,&#34;shaved chocolate&#34;,&#34;marshmallow&#34;]&#39;, 1),&#xA;    (80, &#39;Ivy Winslow&#39;, &#39;[&#34;vanilla foam&#34;,&#34;caramel drizzle&#34;]&#39;, 5),&#xA;    (81, &#39;Jack Carmichael&#39;, &#39;[&#34;crispy rice&#34;]&#39;, 6),&#xA;    (82, &#39;Scarlett Dalton&#39;, &#39;[&#34;white chocolate&#34;,&#34;caramel drizzle&#34;,&#34;peppermint&#34;,&#34;vanilla foam&#34;]&#39;, 5),&#xA;    (83, &#39;Oliver Ashby&#39;, &#39;[&#34;crispy rice&#34;,&#34;peppermint&#34;]&#39;, 2),&#xA;    (84, &#39;Aurora Whitfield&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;crispy rice&#34;]&#39;, 9),&#xA;    (85, &#39;Noah Hastings&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 4),&#xA;    (86, &#39;Eliza Radcliffe&#39;, &#39;[&#34;peppermint&#34;,&#34;vanilla foam&#34;,&#34;white chocolate&#34;]&#39;, 9),&#xA;    (87, &#39;Liam Donovan&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]&#39;, 2),&#xA;    (88, &#39;Penelope Sinclair&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;marshmallow&#34;,&#34;white chocolate&#34;]&#39;, 5),&#xA;    (89, &#39;Ethan Marlowe&#39;, &#39;[&#34;vanilla foam&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;]&#39;, 5),&#xA;    (90, &#39;Charlotte Waverly&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;]&#39;, 7),&#xA;    (91, &#39;Lucas Prescott&#39;, &#39;[&#34;crispy rice&#34;,&#34;vanilla foam&#34;]&#39;, 9),&#xA;    (92, &#39;Amelia Rosewood&#39;, &#39;[&#34;crispy rice&#34;]&#39;, 5),&#xA;    (93, &#39;Henry Treadwell&#39;, &#39;[&#34;vanilla foam&#34;,&#34;caramel drizzle&#34;]&#39;, 8),&#xA;    (94, &#39;Sophie Langford&#39;, &#39;[&#34;dark chocolate&#34;,&#34;shaved chocolate&#34;,&#34;crispy rice&#34;,&#34;white chocolate&#34;]&#39;, 3),&#xA;    (95, &#39;Benjamin Fairweather&#39;, &#39;[&#34;crispy rice&#34;]&#39;, 9),&#xA;    (96, &#39;Grace Aldridge&#39;, 
&#39;[&#34;marshmallow&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]&#39;, 5),&#xA;    (97, &#39;Samuel Kingsley&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;cinnamon&#34;]&#39;, 1),&#xA;    (98, &#39;Eleanor Morrison&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 8),&#xA;    (99, &#39;Daniel Lockwood&#39;, &#39;[&#34;dark chocolate&#34;]&#39;, 7),&#xA;    (100, &#39;Lucy Harrington&#39;, &#39;[&#34;vanilla foam&#34;,&#34;dark chocolate&#34;]&#39;, 7),&#xA;    (101, &#39;Matthew Sutherland&#39;, &#39;[&#34;cinnamon&#34;,&#34;peppermint&#34;]&#39;, 2),&#xA;    (102, &#39;Emma Gilmore&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;cinnamon&#34;,&#34;vanilla foam&#34;,&#34;caramel drizzle&#34;]&#39;, 9),&#xA;    (103, &#39;Alexander Stratton&#39;, &#39;[&#34;peppermint&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;white chocolate&#34;]&#39;, 8),&#xA;    (104, &#39;Abigail Worthington&#39;, &#39;[&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;]&#39;, 8),&#xA;    (105, &#39;William Beauchamp&#39;, &#39;[&#34;white chocolate&#34;,&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]&#39;, 8),&#xA;    (106, &#39;Hannah Livingston&#39;, &#39;[&#34;crispy rice&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;caramel drizzle&#34;]&#39;, 5),&#xA;    (107, &#39;James Garrison&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;crispy rice&#34;,&#34;white chocolate&#34;,&#34;peppermint&#34;]&#39;, 5),&#xA;    (108, &#39;Sophia Brookshire&#39;, &#39;[&#34;crispy rice&#34;,&#34;caramel drizzle&#34;]&#39;, 6),&#xA;    (109, &#39;Theodore Hadley&#39;, &#39;[&#34;cinnamon&#34;]&#39;, 3),&#xA;    (110, &#39;Olivia Carrington&#39;, &#39;[&#34;vanilla foam&#34;,&#34;cinnamon&#34;]&#39;, 4),&#xA;    (111, &#39;Sebastian Ashworth&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 7),&#xA;    (112, &#39;Chloe Blackstone&#39;, &#39;[&#34;white chocolate&#34;,&#34;dark chocolate&#34;,&#34;shaved chocolate&#34;]&#39;, 1),&#xA;    (113, &#39;Nicholas 
Montague&#39;, &#39;[&#34;vanilla foam&#34;,&#34;white chocolate&#34;]&#39;, 1),&#xA;    (114, &#39;Madeline Ramsey&#39;, &#39;[&#34;dark chocolate&#34;,&#34;peppermint&#34;,&#34;cinnamon&#34;,&#34;vanilla foam&#34;]&#39;, 7),&#xA;    (115, &#39;Gabriel Winthrop&#39;, &#39;[&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;marshmallow&#34;]&#39;, 5),&#xA;    (116, &#39;Alice Merriweather&#39;, &#39;[&#34;dark chocolate&#34;,&#34;peppermint&#34;,&#34;shaved chocolate&#34;,&#34;cinnamon&#34;]&#39;, 7),&#xA;    (117, &#39;Isaac Kendrick&#39;, &#39;[&#34;dark chocolate&#34;,&#34;cinnamon&#34;]&#39;, 9),&#xA;    (118, &#39;Lillian Holbrook&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 1),&#xA;    (119, &#39;Caleb Bellamy&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 3),&#xA;    (120, &#39;Rose Drummond&#39;, &#39;[&#34;cinnamon&#34;,&#34;peppermint&#34;,&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]&#39;, 6),&#xA;    (121, &#39;Elijah Wakefield&#39;, &#39;[&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;]&#39;, 6),&#xA;    (122, &#39;Margaret Fairbanks&#39;, &#39;[&#34;crispy rice&#34;,&#34;vanilla foam&#34;,&#34;cinnamon&#34;,&#34;peppermint&#34;]&#39;, 8),&#xA;    (123, &#39;Julian Blackburn&#39;, &#39;[&#34;white chocolate&#34;]&#39;, 1),&#xA;    (124, &#39;Eva Templeton&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;marshmallow&#34;,&#34;vanilla foam&#34;]&#39;, 1),&#xA;    (125, &#39;Nathan Whitley&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 4),&#xA;    (126, &#39;Anna Westfield&#39;, &#39;[&#34;cinnamon&#34;]&#39;, 4),&#xA;    (127, &#39;Aaron Ashcroft&#39;, &#39;[&#34;dark chocolate&#34;,&#34;marshmallow&#34;]&#39;, 4),&#xA;    (128, &#39;Julia Pendleton&#39;, &#39;[&#34;crispy rice&#34;,&#34;caramel drizzle&#34;,&#34;marshmallow&#34;,&#34;white chocolate&#34;]&#39;, 2),&#xA;    (129, &#39;Connor Redmond&#39;, &#39;[&#34;crispy rice&#34;,&#34;marshmallow&#34;]&#39;, 1),&#xA;    (130, &#39;Grace Thornhill&#39;, &#39;[&#34;vanilla 
foam&#34;,&#34;white chocolate&#34;,&#34;caramel drizzle&#34;]&#39;, 4),&#xA;    (131, &#39;Zachary Stafford&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 2),&#xA;    (132, &#39;Caroline Bannister&#39;, &#39;[&#34;marshmallow&#34;,&#34;peppermint&#34;,&#34;cinnamon&#34;]&#39;, 9),&#xA;    (133, &#39;Dylan Blakely&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;marshmallow&#34;,&#34;crispy rice&#34;,&#34;white chocolate&#34;]&#39;, 6),&#xA;    (134, &#39;Katherine Underwood&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 8),&#xA;    (135, &#39;Tyler Braddock&#39;, &#39;[&#34;vanilla foam&#34;]&#39;, 6),&#xA;    (136, &#39;Victoria Harwood&#39;, &#39;[&#34;cinnamon&#34;,&#34;vanilla foam&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]&#39;, 9),&#xA;    (137, &#39;Ryan Beckett&#39;, &#39;[&#34;white chocolate&#34;,&#34;dark chocolate&#34;,&#34;shaved chocolate&#34;]&#39;, 7),&#xA;    (138, &#39;Elizabeth Chesterfield&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;]&#39;, 2),&#xA;    (139, &#39;Jordan Waverly&#39;, &#39;[&#34;dark chocolate&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;]&#39;, 8),&#xA;    (140, &#39;Sarah Remington&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;peppermint&#34;,&#34;shaved chocolate&#34;,&#34;cinnamon&#34;]&#39;, 3),&#xA;    (141, &#39;Brandon Locklear&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;,&#34;cinnamon&#34;]&#39;, 6),&#xA;    (142, &#39;Rachel Wyndham&#39;, &#39;[&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;white chocolate&#34;]&#39;, 4),&#xA;    (143, &#39;Logan Sherwood&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 7),&#xA;    (144, &#39;Amanda Fitzroy&#39;, &#39;[&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;,&#34;dark chocolate&#34;]&#39;, 8),&#xA;    (145, &#39;Jackson Thorpe&#39;, &#39;[&#34;peppermint&#34;,&#34;marshmallow&#34;,&#34;cinnamon&#34;,&#34;dark chocolate&#34;]&#39;, 7),&#xA;    (146, 
&#39;Rebecca Ashcombe&#39;, &#39;[&#34;crispy rice&#34;,&#34;caramel drizzle&#34;]&#39;, 8),&#xA;    (147, &#39;Cameron Gladstone&#39;, &#39;[&#34;vanilla foam&#34;,&#34;caramel drizzle&#34;,&#34;cinnamon&#34;]&#39;, 8),&#xA;    (148, &#39;Jessica Langston&#39;, &#39;[&#34;crispy rice&#34;,&#34;white chocolate&#34;,&#34;marshmallow&#34;]&#39;, 2),&#xA;    (149, &#39;Mason Fairmont&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;marshmallow&#34;]&#39;, 9),&#xA;    (150, &#39;Emily Claridge&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]&#39;, 8),&#xA;    (151, &#39;Hunter Bellingham&#39;, &#39;[&#34;white chocolate&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]&#39;, 4),&#xA;    (152, &#39;Laura Thornbury&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;,&#34;marshmallow&#34;]&#39;, 5),&#xA;    (153, &#39;Wyatt Alderton&#39;, &#39;[&#34;white chocolate&#34;]&#39;, 3),&#xA;    (154, &#39;Claire Berkshire&#39;, &#39;[&#34;peppermint&#34;,&#34;white chocolate&#34;,&#34;crispy rice&#34;]&#39;, 5),&#xA;    (155, &#39;Cole Ashland&#39;, &#39;[&#34;dark chocolate&#34;,&#34;marshmallow&#34;]&#39;, 1),&#xA;    (156, &#39;Diana Brightwell&#39;, &#39;[&#34;dark chocolate&#34;,&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]&#39;, 6),&#xA;    (157, &#39;Aiden Stanfield&#39;, &#39;[&#34;peppermint&#34;,&#34;crispy rice&#34;]&#39;, 8),&#xA;    (158, &#39;Natalie Warwick&#39;, &#39;[&#34;marshmallow&#34;]&#39;, 7),&#xA;    (159, &#39;Parker Blackmore&#39;, &#39;[&#34;marshmallow&#34;,&#34;peppermint&#34;,&#34;white chocolate&#34;,&#34;vanilla foam&#34;]&#39;, 5),&#xA;    (160, &#39;Morgan Steadman&#39;, &#39;[&#34;shaved chocolate&#34;]&#39;, 2),&#xA;    (161, &#39;Blake Dunwood&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;white chocolate&#34;,&#34;dark chocolate&#34;]&#39;, 5),&#xA;    (162, &#39;Taylor Woodridge&#39;, &#39;[&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;white chocolate&#34;,&#34;caramel 
drizzle&#34;]&#39;, 2),&#xA;    (163, &#39;Chase Ashbury&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;crispy rice&#34;]&#39;, 2),&#xA;    (164, &#39;Madison Clearwater&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;cinnamon&#34;]&#39;, 9),&#xA;    (165, &#39;Carter Brookfield&#39;, &#39;[&#34;cinnamon&#34;]&#39;, 1),&#xA;    (166, &#39;Ashley Fairhaven&#39;, &#39;[&#34;dark chocolate&#34;,&#34;white chocolate&#34;,&#34;cinnamon&#34;,&#34;peppermint&#34;]&#39;, 4),&#xA;    (167, &#39;Griffin Hartwell&#39;, &#39;[&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;peppermint&#34;]&#39;, 4),&#xA;    (168, &#39;Megan Redfield&#39;, &#39;[&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;peppermint&#34;]&#39;, 9),&#xA;    (169, &#39;Grayson Westmore&#39;, &#39;[&#34;cinnamon&#34;,&#34;crispy rice&#34;]&#39;, 3),&#xA;    (170, &#39;Nicole Ashridge&#39;, &#39;[&#34;peppermint&#34;]&#39;, 3),&#xA;    (171, &#39;Sawyer Hollingsworth&#39;, &#39;[&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;peppermint&#34;]&#39;, 8),&#xA;    (172, &#39;Alexis Thorndale&#39;, &#39;[&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;dark chocolate&#34;]&#39;, 9),&#xA;    (173, &#39;Declan Summerfield&#39;, &#39;[&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]&#39;, 7),&#xA;    (174, &#39;Samantha Brightwood&#39;, &#39;[&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;dark chocolate&#34;]&#39;, 4),&#xA;    (175, &#39;Tristan Ashbrook&#39;, &#39;[&#34;crispy rice&#34;]&#39;, 1),&#xA;    (176, &#39;Melissa Ravenscroft&#39;, &#39;[&#34;dark chocolate&#34;,&#34;white chocolate&#34;]&#39;, 5),&#xA;    (177, &#39;Colton Hawthorne&#39;, &#39;[&#34;vanilla foam&#34;,&#34;dark chocolate&#34;]&#39;, 2),&#xA;    (178, &#39;Lauren Silverton&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;,&#34;cinnamon&#34;,&#34;dark chocolate&#34;]&#39;, 2),&#xA;    (179, &#39;Landon Whitworth&#39;, &#39;[&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]&#39;, 
8),&#xA;    (180, &#39;Kayla Mansfield&#39;, &#39;[&#34;vanilla foam&#34;,&#34;peppermint&#34;,&#34;shaved chocolate&#34;]&#39;, 2);&#xA;&#xA;INSERT INTO cocoa_cars (car_id, available_mixins, total_stock) VALUES&#xA;    (5, &#39;[&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]&#39;, 412),&#xA;    (2, &#39;[&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]&#39;, 359),&#xA;    (9, &#39;[&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;]&#39;, 354),&#xA;    (4, &#39;[&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]&#39;, 338),&#xA;    (8, &#39;[&#34;vanilla foam&#34;,&#34;marshmallow&#34;]&#39;, 263),&#xA;    (1, &#39;[&#34;peppermint&#34;,&#34;crispy rice&#34;]&#39;, 205),&#xA;    (6, &#39;[&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;crispy rice&#34;,&#34;cinnamon&#34;,&#34;peppermint&#34;]&#39;, 161),&#xA;    (7, &#39;[&#34;caramel drizzle&#34;,&#34;crispy rice&#34;,&#34;marshmallow&#34;,&#34;vanilla foam&#34;,&#34;cinnamon&#34;]&#39;, 132),&#xA;    (3, &#39;[&#34;vanilla foam&#34;,&#34;peppermint&#34;]&#39;, 95);&#xA;&#xA;```&#xA;&#xA;Changing `TEXT[]` to `TEXT` is not required, as SQLite really doesn&#39;t enforce column types unless the table is declared `STRICT`. We&#39;ll take that flexibility here.&#xA;&#xA;## Problem&#xA;&#xA;&gt; Get the stewards a list of all the passengers and the cocoa car(s) they can be served from that has at least one of their favorite mixins.&#xA;&gt; &#xA;&gt; Remember only the top three most-stocked cocoa cars remained operational, so the passengers must be served from one of those cars.&#xA;&#xA;Ok so, we have two tables.&#xA;&#xA;1. `cocoa_cars`&#xA;2. `passengers`&#xA;&#xA;What we need to do is list each passenger along with the `car_id`(s) of the cocoa cars that can satisfy at least one of their favorite mixins. 
So, there are limited cars, just 9 cars in total right?&#xA;&#xA;```sql&#xA;SELECT * FROM cocoa_cars ORDER BY car_id;&#xA;```&#xA;So, there are only 9 cars.&#xA;```&#xA;sqlite&gt; SELECT * FROM cocoa_cars ORDER BY car_id;&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| car_id |                       available_mixins                       | total_stock |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 1      | [&#34;peppermint&#34;,&#34;crispy rice&#34;]                                 | 205         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 2      | [&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]                 | 359         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 3      | [&#34;vanilla foam&#34;,&#34;peppermint&#34;]                                | 95          |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 4      | [&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]                       | 338         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 6      | [&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;crispy rice&#34;,&#34;cinnamon | 161         |&#xA;|        | &#34;,&#34;peppermint&#34;]                                              |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 7      | [&#34;caramel drizzle&#34;,&#34;crispy rice&#34;,&#34;marshmallow&#34;,&#34;vanilla foam | 132         |&#xA;|        | &#34;,&#34;cinnamon&#34;]                  
                              |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 8      | [&#34;vanilla foam&#34;,&#34;marshmallow&#34;]                               | 263         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         |&#xA;|        | ate&#34;]                                                        |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;However, the question states we only need to consider the top three most-stocked cars. We can order by `total_stock` descending and pick the top 3 cars with the `LIMIT 3` clause.&#xA;&#xA;```sql&#xA;SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; &#xA;sqlite&gt; select * from cocoa_cars ORDER BY total_stock DESC;&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| car_id |                       available_mixins                       | total_stock |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 2      | [&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]                 | 359         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         |&#xA;|        | ate&#34;]                                                        |             
|&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 4      | [&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]                       | 338         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 8      | [&#34;vanilla foam&#34;,&#34;marshmallow&#34;]                               | 263         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 1      | [&#34;peppermint&#34;,&#34;crispy rice&#34;]                                 | 205         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 6      | [&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;crispy rice&#34;,&#34;cinnamon | 161         |&#xA;|        | &#34;,&#34;peppermint&#34;]                                              |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 7      | [&#34;caramel drizzle&#34;,&#34;crispy rice&#34;,&#34;marshmallow&#34;,&#34;vanilla foam | 132         |&#xA;|        | &#34;,&#34;cinnamon&#34;]                                                |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 3      | [&#34;vanilla foam&#34;,&#34;peppermint&#34;]                                | 95          |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;sqlite&gt; select * from cocoa_cars ORDER BY total_stock DESC LIMIT 3;&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| car_id |                       available_mixins                       | total_stock |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         
|&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 2      | [&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]                 | 359         |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         |&#xA;|        | ate&#34;]                                                        |             |&#xA;+--------+--------------------------------------------------------------+-------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;Now we have the three cars we can start assigning the passengers to.&#xA;How?&#xA;&#xA;We need to assign each passenger the `car_id`s whose `available_mixins` contain one or more of their `favorite_mixins`.&#xA;Now, this is the tricky part.&#xA;&#xA;We are in SQLite!&#xA;&#xA;We have the `favorite_mixins` for `Mateo Cruz` as a string like `[&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]` which we need to match against these 3 car `available_mixins`:&#xA;- `[&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]` on car_id `5`&#xA;- `[&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]` on car_id `2`&#xA;- `[&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;]` on car_id `9`&#xA;&#xA;&#xA;So here all three cars have at least one, right?&#xA;- car_id `5` has `shaved chocolate`&#xA;- car_id `2` has `caramel drizzle`&#xA;- car_id `9` has 2 of his 3 `favorite_mixins`: `caramel drizzle` and `shaved chocolate`.&#xA;&#xA;So for `Mateo Cruz` we should ideally return the car_ids `[5, 2, 9]`, either as one list or as separate rows; it doesn&#39;t matter as much, I think. 
But the first one looks cool!&#xA;&#xA;So, how do we do it?&#xA;&#xA;First, let&#39;s start with what we had!&#xA;&#xA;The top 3 cars:&#xA;&#xA;```sql&#xA;SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3;&#xA;```&#xA;&#xA;This gives us the table against which, for each car_id, we can check whether any of one passenger&#39;s favorite mixins is available.&#xA;&#xA;But how do we split the JSON strings in `available_mixins` and `favorite_mixins`?&#xA;&#xA;We can use the [json_each](https://sqlite.org/json1.html#jeach) function, which takes any valid JSON string (a raw literal or a column value) and returns one row per element, along with quite a bit of metadata.&#xA;&#xA;Let&#39;s just try to select everything from `json_each` with the `favorite_mixins` column.&#xA;&#xA;```sql&#xA;SELECT * FROM json_each(favorite_mixins) FROM passengers;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT * FROM json_each(favorite_mixins) FROM passengers;&#xA;Parse error: near &#34;FROM&#34;: syntax error&#xA;  SELECT * FROM json_each(favorite_mixins) FROM passengers;&#xA;                             error here ---^&#xA;sqlite&gt; &#xA;```&#xA;Oops!&#xA;Why?&#xA;Because we have given it the whole column; it can only take one value at a time, so we need to hand it that value for each row.&#xA;&#xA;&#xA;```sql&#xA;SELECT * FROM json_each((SELECT favorite_mixins FROM passengers));&#xA;```&#xA;&#xA;Here we try to give it only the `favorite_mixins` value from the passengers table.&#xA;&#xA;```&#xA;sqlite&gt; SELECT * FROM json_each((SELECT favorite_mixins FROM passengers));&#xA;+-----+--------------+------+--------------+----+--------+---------+------+&#xA;| key |    value     | type |     atom     | id | parent | fullkey | path |&#xA;+-----+--------------+------+--------------+----+--------+---------+------+&#xA;| 0   | vanilla foam | text | vanilla foam | 2  |        | $[0]    | $    |&#xA;+-----+--------------+------+--------------+----+--------+---------+------+&#xA;sqlite&gt; select * from passengers 
limit 5;&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| passenger_id | passenger_name |                       favorite_mixins                        | car_id |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 1            | Ava Johnson    | [&#34;vanilla foam&#34;]                                             | 2      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 2            | Mateo Cruz     | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | 2      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 3            | Nia Grant      | [&#34;shaved chocolate&#34;]                                         | 5      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 4            | Hiro Tanaka    | [&#34;shaved chocolate&#34;]                                         | 2      |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;| 5            | Layla Brooks   | [&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;,&#34;cinnamon&#34; | 3      |&#xA;|              |                | ]                                                            |        |&#xA;+--------------+----------------+--------------------------------------------------------------+--------+&#xA;sqlite&gt; &#xA;```&#xA;But it only gave us the first row&#39;s expansion, and unluckily that passenger had just one mixin.&#xA;&#xA;What is happening?&#xA;&#xA;Well!&#xA;&#xA;This actually selects only the first row&#39;s `favorite_mixins` because `json_each()` processes one value at a time. Since favorite_mixins is a JSON array, SQLite expects a single array value per row. 
When we try to pass the entire column, it only processes the first row of `favorite_mixins`&#xA;&#xA;Let&#39;s try to use the json_each for passengers with more than one favorite mixins.&#xA;&#xA;```sql&#xA;SELECT * &#xA;FROM json_each(&#xA;     (SELECT favorite_mixins FROM passengers WHERE passenger_name = &#39;Mateo Cruz&#39;)&#xA;);&#xA;```&#xA;&#xA;&#xA;```&#xA;sqlite&gt; SELECT * &#xA;FROM json_each((SELECT favorite_mixins FROM passengers WHERE passenger_name = &#39;Ava Johnson&#39;));&#xA;+-----+--------------+------+--------------+----+--------+---------+------+&#xA;| key |    value     | type |     atom     | id | parent | fullkey | path |&#xA;+-----+--------------+------+--------------+----+--------+---------+------+&#xA;| 0   | vanilla foam | text | vanilla foam | 2  |        | $[0]    | $    |&#xA;+-----+--------------+------+--------------+----+--------+---------+------+&#xA;sqlite&gt; SELECT * &#xA;FROM json_each((SELECT favorite_mixins FROM passengers WHERE passenger_name = &#39;Mateo Cruz&#39;));&#xA;+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| key |      value       | type |       atom       | id | parent | fullkey | path |&#xA;+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 0   | caramel drizzle  | text | caramel drizzle  | 2  |        | $[0]    | $    |&#xA;| 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    |&#xA;| 2   | white chocolate  | text | white chocolate  | 37 |        | $[2]    | $    |&#xA;+-----+------------------+------+------------------+----+--------+---------+------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Ok now that is neat, it returned 3 rows for the 3 favorite items for the passenger `Mateo Cruz`&#xA;&#xA;Now what?&#xA;&#xA;How do we get it for all the passengers? 
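Before scaling up, the single-passenger expansion above can be verified end to end in a tiny standalone script. This is a minimal sketch using Python's bundled `sqlite3` module (the table is trimmed to just `Mateo Cruz`; it assumes your SQLite build ships the JSON1 functions, which any recent CPython does):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE passengers (passenger_name TEXT, favorite_mixins TEXT)")
con.execute(
    """INSERT INTO passengers VALUES
       ('Mateo Cruz', '["caramel drizzle","shaved chocolate","white chocolate"]')"""
)

# json_each() expands the JSON array into one row per element:
# for arrays, key is the integer index and value is the element itself.
rows = con.execute(
    """SELECT key, value
       FROM json_each((SELECT favorite_mixins FROM passengers
                       WHERE passenger_name = 'Mateo Cruz'))"""
).fetchall()
print(rows)
# [(0, 'caramel drizzle'), (1, 'shaved chocolate'), (2, 'white chocolate')]
```

The scalar subquery still only ever hands `json_each` a single value, which is exactly the limitation the join below works around.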
&#xA;&#xA;How about a JOIN? The passenger&#39;s columns stay the same on every row; only the mixin value changes, one row per entry in their favorite list.&#xA;&#xA;&#xA;```sql&#xA;SELECT *&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) AS mixin &#xA;ORDER BY passengers.passenger_name;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT *&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) AS mixin &#xA;ORDER BY passengers.passenger_name;&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| passenger_id |     passenger_name     |                       favorite_mixins                        | car_id | key |      value       | type |       atom       | id | parent | fullkey | path |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 127          | Aaron Ashcroft         | [&#34;dark chocolate&#34;,&#34;marshmallow&#34;]                             | 4      | 0   | dark chocolate   | text | dark chocolate   | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 127          | Aaron Ashcroft         | [&#34;dark chocolate&#34;,&#34;marshmallow&#34;]                             | 4      | 1   | marshmallow      | text | marshmallow      | 18 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 104          | Abigail Worthington    | [&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;]       
                  | 8      | 0   | dark chocolate   | text | dark chocolate   | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 104          | Abigail Worthington    | [&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;]                         | 8      | 1   | caramel drizzle  | text | caramel drizzle  | 18 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 157          | Aiden Stanfield        | [&#34;peppermint&#34;,&#34;crispy rice&#34;]                                 | 8      | 0   | peppermint       | text | peppermint       | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 157          | Aiden Stanfield        | [&#34;peppermint&#34;,&#34;crispy rice&#34;]                                 | 8      | 1   | crispy rice      | text | crispy rice      | 13 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 103          | Alexander Stratton     | [&#34;peppermint&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;white choco | 8      | 0   | peppermint       | text | peppermint       | 2  |        | $[0]    | $    |&#xA;|              |                        | late&#34;]                                                       |        |     |                  |      |                  |    |        |         |      
|&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 103          | Alexander Stratton     | [&#34;peppermint&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;white choco | 8      | 1   | shaved chocolate | text | shaved chocolate | 13 |        | $[1]    | $    |&#xA;|              |                        | late&#34;]                                                       |        |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 103          | Alexander Stratton     | [&#34;peppermint&#34;,&#34;shaved chocolate&#34;,&#34;vanilla foam&#34;,&#34;white choco | 8      | 2   | vanilla foam     | text | vanilla foam     | 31 |        | $[2]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;...&#xA;...&#xA;| 153          | Wyatt Alderton         | [&#34;white chocolate&#34;]                                          | 3      | 0   | white chocolate  | text | white chocolate  | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 15           | Yara Haddad            | [&#34;white chocolate&#34;,&#34;dark chocolate&#34;]                         | 2      | 0   | white chocolate  | text | white chocolate  | 2  |        | $[0]    | $    
|&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 15           | Yara Haddad            | [&#34;white chocolate&#34;,&#34;dark chocolate&#34;]                         | 2      | 1   | dark chocolate   | text | dark chocolate   | 19 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 51           | Youssef El-Sayed       | [&#34;peppermint&#34;,&#34;marshmallow&#34;]                                 | 3      | 0   | peppermint       | text | peppermint       | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 51           | Youssef El-Sayed       | [&#34;peppermint&#34;,&#34;marshmallow&#34;]                                 | 3      | 1   | marshmallow      | text | marshmallow      | 13 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 57           | Zachary Collins        | [&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]          | 2      | 0   | dark chocolate   | text | dark chocolate   | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 57           | Zachary Collins        | [&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]     
     | 2      | 1   | caramel drizzle  | text | caramel drizzle  | 18 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 57           | Zachary Collins        | [&#34;dark chocolate&#34;,&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]          | 2      | 2   | vanilla foam     | text | vanilla foam     | 35 |        | $[2]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 131          | Zachary Stafford       | [&#34;shaved chocolate&#34;]                                         | 2      | 0   | shaved chocolate | text | shaved chocolate | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 11           | Zara Sheikh            | [&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]                  | 4      | 0   | vanilla foam     | text | vanilla foam     | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 11           | Zara Sheikh            | [&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]                  | 4      | 1   | crispy rice      | text | crispy rice      | 16 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 11           | 
Zara Sheikh            | [&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]                  | 4      | 2   | peppermint       | text | peppermint       | 28 |        | $[2]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 42           | Zoe Wilson             | [&#34;marshmallow&#34;,&#34;dark chocolate&#34;]                             | 9      | 0   | marshmallow      | text | marshmallow      | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 42           | Zoe Wilson             | [&#34;marshmallow&#34;,&#34;dark chocolate&#34;]                             | 9      | 1   | dark chocolate   | text | dark chocolate   | 14 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;&#xA;```&#xA;&#xA;It just automagically joins the expanded rows to the right passenger: SQLite treats `json_each(...)` as a table-valued function and evaluates it once per row of `passengers` when we pass it a column holding a JSON list.&#xA;&#xA;But we don&#39;t want everything, we just want the passenger and the mixin names. 
And let&#39;s also include the index just to see the data.&#xA;&#xA;```sql&#xA;SELECT passengers.passenger_name, mixin.key AS mixin_index, mixin.value AS mixin&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) AS mixin&#xA;ORDER BY passengers.passenger_name;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT passengers.passenger_name, mixin.key AS mixin_index, mixin.value AS mixin&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) AS mixin&#xA;ORDER BY passengers.passenger_name;&#xA;+------------------------+-------------+------------------+&#xA;|     passenger_name     | mixin_index |      mixin       |&#xA;+------------------------+-------------+------------------+&#xA;| Aaron Ashcroft         | 0           | dark chocolate   |&#xA;| Aaron Ashcroft         | 1           | marshmallow      |&#xA;| Abigail Worthington    | 0           | dark chocolate   |&#xA;| Abigail Worthington    | 1           | caramel drizzle  |&#xA;| Aiden Stanfield        | 0           | peppermint       |&#xA;| Aiden Stanfield        | 1           | crispy rice      |&#xA;| Alexander Stratton     | 0           | peppermint       |&#xA;| Alexander Stratton     | 1           | shaved chocolate |&#xA;| Alexander Stratton     | 2           | vanilla foam     |&#xA;| Alexander Stratton     | 3           | white chocolate  |&#xA;...&#xA;...&#xA;| Yara Haddad            | 1           | dark chocolate   |&#xA;| Youssef El-Sayed       | 0           | peppermint       |&#xA;| Youssef El-Sayed       | 1           | marshmallow      |&#xA;| Zachary Collins        | 0           | dark chocolate   |&#xA;| Zachary Collins        | 1           | caramel drizzle  |&#xA;| Zachary Collins        | 2           | vanilla foam     |&#xA;| Zachary Stafford       | 0           | shaved chocolate |&#xA;| Zara Sheikh            | 0           | vanilla foam     |&#xA;| Zara Sheikh            | 1           | crispy rice      |&#xA;| Zara Sheikh            | 2           | peppermint       
|&#xA;| Zoe Wilson             | 0           | marshmallow      |&#xA;| Zoe Wilson             | 1           | dark chocolate   |&#xA;+------------------------+-------------+------------------+&#xA;&#xA;```&#xA;&#xA;That looks good.&#xA;&#xA;Now what?&#xA;&#xA;We also need to do it for the cocoa_cars with `available_mixins`&#xA;&#xA;```sql&#xA;SELECT &#xA;    *&#xA;FROM cocoa_cars &#xA;JOIN &#xA;    json_each(available_mixins) AS mixin&#xA;ORDER BY total_stock DESC;&#xA;```&#xA;OR&#xA;&#xA;```sql&#xA;SELECT &#xA;    car_id,&#xA;    mixin.key AS mixin_index,&#xA;    mixin.value AS mixin &#xA;FROM cocoa_cars &#xA;JOIN&#xA;    json_each(available_mixins) as mixin &#xA;ORDER BY total_stock DESC;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT car_id, mixin.key as mixin_index, mixin.value as mixin from cocoa_cars join json_each(available_mixins) as mixin ORDER BY total_stock DESC;&#xA;+--------+-------------+------------------+&#xA;| car_id | mixin_index |      mixin       |&#xA;+--------+-------------+------------------+&#xA;| 5      | 0           | white chocolate  |&#xA;| 5      | 1           | shaved chocolate |&#xA;| 2      | 0           | cinnamon         |&#xA;| 2      | 1           | marshmallow      |&#xA;| 2      | 2           | caramel drizzle  |&#xA;| 9      | 0           | crispy rice      |&#xA;| 9      | 1           | peppermint       |&#xA;| 9      | 2           | caramel drizzle  |&#xA;| 9      | 3           | shaved chocolate |&#xA;| 4      | 0           | shaved chocolate |&#xA;| 4      | 1           | white chocolate  |&#xA;| 8      | 0           | vanilla foam     |&#xA;| 8      | 1           | marshmallow      |&#xA;| 1      | 0           | peppermint       |&#xA;| 1      | 1           | crispy rice      |&#xA;| 6      | 0           | shaved chocolate |&#xA;| 6      | 1           | dark chocolate   |&#xA;| 6      | 2           | crispy rice      |&#xA;| 6      | 3           | cinnamon         |&#xA;| 6      | 4           | peppermint       |&#xA;| 7     
 | 0           | caramel drizzle  |&#xA;| 7      | 1           | crispy rice      |&#xA;| 7      | 2           | marshmallow      |&#xA;| 7      | 3           | vanilla foam     |&#xA;| 7      | 4           | cinnamon         |&#xA;| 3      | 0           | vanilla foam     |&#xA;| 3      | 1           | peppermint       |&#xA;+--------+-------------+------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;But hold on! We only needed it for the top 3 stocked cars.&#xA;&#xA;So, how do we do it? We could (bad and dirty practice) hardcode the limit from the mixin counts of cars 5, 2, and 9, but that breaks the moment the stock changes.&#xA;&#xA;We need to build this table dynamically.&#xA;&#xA;CTEs to the rescue!&#xA;&#xA;```sql&#xA;WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT &#xA;    *&#xA;FROM stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins);&#xA;```&#xA;&#xA;OR&#xA;&#xA;```sql&#xA;WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT &#xA;    car_id,&#xA;    car_mixins.key as car_mixin_index,&#xA;    car_mixins.value as car_mixin&#xA;FROM stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) as car_mixins;&#xA;```&#xA;&#xA;&#xA;```&#xA;sqlite&gt; WITH stocked_cars as (&#xA;(x1...&gt; select * from cocoa_cars ORDER BY total_stock DESC LIMIT 3)&#xA;   ...&gt; select * from stocked_cars JOIN json_each(stocked_cars.available_mixins);&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| car_id |                       available_mixins                       | total_stock | key |      value       | type |       atom       | id | parent | fullkey | path |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 5      | [&#34;white 
chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 0   | white chocolate  | text | white chocolate  | 2  |        | $[0]    | $    |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 2      | [&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]                 | 359         | 0   | cinnamon         | text | cinnamon         | 2  |        | $[0]    | $    |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 2      | [&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]                 | 359         | 1   | marshmallow      | text | marshmallow      | 11 |        | $[1]    | $    |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 2      | [&#34;cinnamon&#34;,&#34;marshmallow&#34;,&#34;caramel drizzle&#34;]                 | 359         | 2   | caramel drizzle  | text | caramel drizzle  | 23 |        | $[2]    | $    |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 0   | crispy rice      | text | crispy rice      | 2  |        
| $[0]    | $    |&#xA;|        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 1   | peppermint       | text | peppermint       | 14 |        | $[1]    | $    |&#xA;|        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 2   | caramel drizzle  | text | caramel drizzle  | 25 |        | $[2]    | $    |&#xA;|        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 3   | shaved chocolate | text | shaved chocolate | 42 |        | $[3]    | $    |&#xA;|        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      
|&#xA;+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;sqlite&gt; &#xA;sqlite&gt; WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT &#xA;    car_id,&#xA;    car_mixins.key as car_mixin_index,&#xA;    car_mixins.value as car_mixin&#xA;FROM stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) as car_mixins;&#xA;+--------+-----------------+------------------+&#xA;| car_id | car_mixin_index |    car_mixin     |&#xA;+--------+-----------------+------------------+&#xA;| 5      | 0               | white chocolate  |&#xA;| 5      | 1               | shaved chocolate |&#xA;| 2      | 0               | cinnamon         |&#xA;| 2      | 1               | marshmallow      |&#xA;| 2      | 2               | caramel drizzle  |&#xA;| 9      | 0               | crispy rice      |&#xA;| 9      | 1               | peppermint       |&#xA;| 9      | 2               | caramel drizzle  |&#xA;| 9      | 3               | shaved chocolate |&#xA;+--------+-----------------+------------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;Now what next?&#xA;&#xA;We simply need to combine all of it:&#xA;&#xA;- Grab the top 3 cars &#xA;- Grab the passenger favorite mixins (expand with json_each)&#xA;- Grab the car available mixins (expand with json_each, using the top 3 cars CTE)&#xA;- JOIN them where a car&#39;s available mixin matches one of the passenger&#39;s favorite mixins&#xA;&#xA;```sql&#xA;WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT passengers.passenger_name, stocked_cars.car_id&#xA;FROM passengers  &#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars  &#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value;&#xA;```&#xA;&#xA;OR &#xA;&#xA;Select some 
more rows for visual confirmation.&#xA;&#xA;```sql&#xA;WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT passengers.passenger_name, passengers.favorite_mixins, stocked_cars.available_mixins, passenger_mixin.value as passenger_mixin, available_mixin.value as available_mixin, stocked_cars.car_id&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value;&#xA;&#xA;```&#xA;&#xA;OR&#xA;&#xA;select everything &#xA;&#xA;```sql&#xA;WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT *&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value;&#xA;&#xA;```&#xA;&#xA;&#xA;```&#xA;sqlite&gt; WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT passengers.passenger_name, stocked_cars.car_id&#xA;FROM passengers  &#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars  &#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value;&#xA;&#xA;+------------------------+--------+&#xA;|     passenger_name     | car_id |&#xA;+------------------------+--------+&#xA;| Mateo Cruz             | 5      |&#xA;| Mateo Cruz             | 5      |&#xA;| Nia Grant              | 5      |&#xA;| Hiro Tanaka            | 5      |&#xA;| Ravi Patel             | 5      |&#xA;| Ravi Patel             | 5      |&#xA;| Elena Morales          | 5      |&#xA;| Elena Morales          | 5      |&#xA;| Diego Ramos            | 5      |&#xA;| Caleb Osei             | 5      |&#xA;| Caleb Osei             | 
5      |&#xA;| Lucas Ford             | 5      |&#xA;| Yara Haddad            | 5      |&#xA;| Tariq Hassan           | 5      |&#xA;| Eva Schmidt            | 5      |&#xA;| Ingrid Nilsen          | 5      |&#xA;| Sophia Rossi           | 5      |&#xA;| Olivia Dubois          | 5      |&#xA;| Emma Svensson          | 5      |&#xA;| Isabella Laurent       | 5      |&#xA;| James Kim              | 5      |&#xA;...&#xA;...&#xA;| Griffin Hartwell       | 9      |&#xA;| Griffin Hartwell       | 9      |&#xA;| Megan Redfield         | 9      |&#xA;| Megan Redfield         | 9      |&#xA;| Grayson Westmore       | 9      |&#xA;| Nicole Ashridge        | 9      |&#xA;| Sawyer Hollingsworth   | 9      |&#xA;| Sawyer Hollingsworth   | 9      |&#xA;| Alexis Thorndale       | 9      |&#xA;| Declan Summerfield     | 9      |&#xA;| Declan Summerfield     | 9      |&#xA;| Samantha Brightwood    | 9      |&#xA;| Samantha Brightwood    | 9      |&#xA;| Tristan Ashbrook       | 9      |&#xA;| Lauren Silverton       | 9      |&#xA;| Landon Whitworth       | 9      |&#xA;| Kayla Mansfield        | 9      |&#xA;| Kayla Mansfield        | 9      |&#xA;+------------------------+--------+&#xA;sqlite&gt;&#xA;&#xA;&#xA;sqlite&gt; WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT passengers.passenger_name, passengers.favorite_mixins, stocked_cars.available_mixins, passenger_mixin.value as passenger_mixin, available_mixin.value as available_mixin, stocked_cars.car_id&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value;&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;|     passenger_name     |         
              favorite_mixins                        |                       available_mixins                       | passenger_mixin  | available_mixin  | car_id |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Mateo Cruz             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | shaved chocolate | shaved chocolate | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Mateo Cruz             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Nia Grant              | [&#34;shaved chocolate&#34;]                                         | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | shaved chocolate | shaved chocolate | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Hiro Tanaka            | [&#34;shaved chocolate&#34;]                                         | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | shaved chocolate | shaved chocolate | 5      
|&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Ravi Patel             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | shaved chocolate | shaved chocolate | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Ravi Patel             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Elena Morales          | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]     | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Elena Morales          | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]     | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | shaved chocolate | shaved chocolate | 5      
|&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Diego Ramos            | [&#34;shaved chocolate&#34;]                                         | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | shaved chocolate | shaved chocolate | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Caleb Osei             | [&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;white chocolate&#34;]      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | shaved chocolate | shaved chocolate | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Caleb Osei             | [&#34;shaved chocolate&#34;,&#34;dark chocolate&#34;,&#34;white chocolate&#34;]      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Lucas Ford             | [&#34;vanilla foam&#34;,&#34;white chocolate&#34;,&#34;cinnamon&#34;]                | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Yara Haddad 
           | [&#34;white chocolate&#34;,&#34;dark chocolate&#34;]                         | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Tariq Hassan           | [&#34;dark chocolate&#34;,&#34;crispy rice&#34;,&#34;white chocolate&#34;,&#34;peppermin | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;|                        | t&#34;]                                                          |                                                              |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Eva Schmidt            | [&#34;white chocolate&#34;,&#34;marshmallow&#34;]                            | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | white chocolate  | white chocolate  | 5      |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;...&#xA;...&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Sawyer Hollingsworth   | [&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;peppermint&#34;]                | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | crispy rice      | crispy rice      | 9      
|&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Sawyer Hollingsworth   | [&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;peppermint&#34;]                | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | peppermint       | peppermint       | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Alexis Thorndale       | [&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;dark chocolate&#34;]              | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | crispy rice      | crispy rice      | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Declan Summerfield     | [&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]  | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | crispy rice      | crispy rice   
   | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Declan Summerfield     | [&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]  | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | peppermint       | peppermint       | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Samantha Brightwood    | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;dark chocolate&#34;]                | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | crispy rice      | crispy rice      | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Samantha Brightwood    | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;dark chocolate&#34;]                | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | peppermint       | 
peppermint       | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Tristan Ashbrook       | [&#34;crispy rice&#34;]                                              | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | crispy rice      | crispy rice      | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Lauren Silverton       | [&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;,&#34;cinnamon&#34;,&#34;dark chocolate | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | caramel drizzle  | caramel drizzle  | 9      |&#xA;|                        | &#34;]                                                           | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Landon Whitworth       | [&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]                           | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | caramel drizzle  | caramel 
drizzle  | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Kayla Mansfield        | [&#34;vanilla foam&#34;,&#34;peppermint&#34;,&#34;shaved chocolate&#34;]             | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | peppermint       | peppermint       | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;| Kayla Mansfield        | [&#34;vanilla foam&#34;,&#34;peppermint&#34;,&#34;shaved chocolate&#34;]             | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | shaved chocolate | shaved chocolate | 9      |&#xA;|                        |                                                              | ate&#34;]                                                        |                  |                  |        |&#xA;+------------------------+--------------------------------------------------------------+--------------------------------------------------------------+------------------+------------------+--------+&#xA;sqlite&gt; &#xA;&#xA;&#xA;&#xA;&#xA;&#xA;sqlite&gt; WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT *                                             &#xA;FROM 
passengers&#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value;&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| passenger_id |     passenger_name     |                       favorite_mixins                        | car_id | key |      value       | type |       atom       | id | parent | fullkey | path | car_id |                       available_mixins                       | total_stock | key |      value       | type |       atom       | id | parent | fullkey | path |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 2            | Mateo Cruz             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | 2      | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    
|&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 2            | Mateo Cruz             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | 2      | 2   | white chocolate  | text | white chocolate  | 37 |        | $[2]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 0   | white chocolate  | text | white chocolate  | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 3            | Nia Grant              | [&#34;shaved chocolate&#34;]                                         | 5      | 0   | shaved chocolate | text | shaved chocolate | 2  |        | $[0]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 4            | Hiro Tanaka            | [&#34;shaved chocolate&#34;]                    
                     | 2      | 0   | shaved chocolate | text | shaved chocolate | 2  |        | $[0]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 6            | Ravi Patel             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | 5      | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 6            | Ravi Patel             | [&#34;caramel drizzle&#34;,&#34;shaved chocolate&#34;,&#34;white chocolate&#34;]     | 5      | 2   | white chocolate  | text | white chocolate  | 37 |        | $[2]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 0   | white chocolate  | text | white chocolate  | 2  |        | $[0]    | $    
|&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 9            | Elena Morales          | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]     | 6      | 0   | white chocolate  | text | white chocolate  | 2  |        | $[0]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 0   | white chocolate  | text | white chocolate  | 2  |        | $[0]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 9            | Elena Morales          | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;,&#34;caramel drizzle&#34;]     | 6      | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 10           | Diego Ramos            | [&#34;shaved chocolate&#34;]    
                                     | 1      | 0   | shaved chocolate | text | shaved chocolate | 2  |        | $[0]    | $    | 5      | [&#34;white chocolate&#34;,&#34;shaved chocolate&#34;]                       | 412         | 1   | shaved chocolate | text | shaved chocolate | 19 |        | $[1]    | $    |&#xA;...&#xA;...&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 171          | Sawyer Hollingsworth   | [&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;peppermint&#34;]                | 8      | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 171          | Sawyer Hollingsworth   | [&#34;crispy rice&#34;,&#34;dark chocolate&#34;,&#34;peppermint&#34;]                | 8      | 
2   | peppermint       | text | peppermint       | 30 |        | $[2]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 1   | peppermint       | text | peppermint       | 14 |        | $[1]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 172          | Alexis Thorndale       | [&#34;vanilla foam&#34;,&#34;crispy rice&#34;,&#34;dark chocolate&#34;]              | 9      | 1   | crispy rice      | text | crispy rice      | 16 |        | $[1]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      
|&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 173          | Declan Summerfield     | [&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]  | 7      | 2   | crispy rice      | text | crispy rice      | 30 |        | $[2]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 173          | Declan Summerfield     | [&#34;dark chocolate&#34;,&#34;marshmallow&#34;,&#34;crispy rice&#34;,&#34;peppermint&#34;]  | 7      | 3   | peppermint       | text | peppermint       | 42 |        | $[3]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 1   | peppermint       | text | peppermint       | 14 |        | $[1]    | $    |&#xA;|              |              
          |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 174          | Samantha Brightwood    | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;dark chocolate&#34;]                | 4      | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 174          | Samantha Brightwood    | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;dark chocolate&#34;]                | 
4      | 1   | peppermint       | text | peppermint       | 14 |        | $[1]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 1   | peppermint       | text | peppermint       | 14 |        | $[1]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 175          | Tristan Ashbrook       | [&#34;crispy rice&#34;]                                              | 1      | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 0   | crispy rice      | text | crispy rice      | 2  |        | $[0]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      
|&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 178          | Lauren Silverton       | [&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;,&#34;cinnamon&#34;,&#34;dark chocolate | 2      | 0   | caramel drizzle  | text | caramel drizzle  | 2  |        | $[0]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 2   | caramel drizzle  | text | caramel drizzle  | 25 |        | $[2]    | $    |&#xA;|              |                        | &#34;]                                                           |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 179          | Landon Whitworth       | [&#34;caramel drizzle&#34;,&#34;vanilla foam&#34;]                           | 8      | 0   | caramel drizzle  | text | caramel drizzle  | 2  |        | $[0]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 2   | caramel drizzle  | text | caramel drizzle  | 25 |        | $[2]    | $    |&#xA;|              |                        |     
                                                         |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 180          | Kayla Mansfield        | [&#34;vanilla foam&#34;,&#34;peppermint&#34;,&#34;shaved chocolate&#34;]             | 2      | 1   | peppermint       | text | peppermint       | 16 |        | $[1]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 1   | peppermint       | text | peppermint       | 14 |        | $[1]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;| 180          | Kayla Mansfield        | [&#34;vanilla foam&#34;,&#34;peppermint&#34;,&#34;shaved chocolate&#34;]             | 2      | 2   | 
shaved chocolate | text | shaved chocolate | 27 |        | $[2]    | $    | 9      | [&#34;crispy rice&#34;,&#34;peppermint&#34;,&#34;caramel drizzle&#34;,&#34;shaved chocol | 354         | 3   | shaved chocolate | text | shaved chocolate | 42 |        | $[3]    | $    |&#xA;|              |                        |                                                              |        |     |                  |      |                  |    |        |         |      |        | ate&#34;]                                                        |             |     |                  |      |                  |    |        |         |      |&#xA;+--------------+------------------------+--------------------------------------------------------------+--------+-----+------------------+------+------------------+----+--------+---------+------+--------+--------------------------------------------------------------+-------------+-----+------------------+------+------------------+----+--------+---------+------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;So what did we do here?&#xA;- Created a CTE with the top 3 stocked cars.&#xA;- The main query fetches each passenger with their favorite mixins expanded.&#xA;- It joins against each stocked car&#39;s expanded available_mixins, matching when the passenger&#39;s mixin equals a mixin available in the car.&#xA;- Hence we get the `car_id` of every car that serves a mixin for that passenger.&#xA;&#xA;You can see we have multiple rows for each passenger, which we might not want. 
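
The join logic above can be sanity-checked outside the sqlite shell. Here is a minimal sketch using the Python `sqlite3` module on a tiny made-up dataset (the table and column names follow the post, but the rows are hypothetical), assuming a SQLite build with the JSON1 functions enabled (the default in recent builds):

```python
import sqlite3

# Hypothetical mini dataset mirroring the post's schema.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE passengers (passenger_name TEXT, favorite_mixins TEXT);
CREATE TABLE cocoa_cars (car_id INTEGER, total_stock INTEGER, available_mixins TEXT);
INSERT INTO passengers VALUES
    ('Ada', '["peppermint","crispy rice"]'),
    ('Bob', '["vanilla foam"]');
INSERT INTO cocoa_cars VALUES
    (1, 50, '["peppermint"]'),
    (2, 90, '["crispy rice","peppermint"]');
""")

# Same join shape as in the post: expand both JSON arrays with json_each
# and match passenger mixins against each car's available mixins.
rows = conn.execute("""
SELECT passengers.passenger_name, cocoa_cars.car_id
FROM passengers
JOIN json_each(passengers.favorite_mixins) AS passenger_mixin
JOIN cocoa_cars
JOIN json_each(cocoa_cars.available_mixins) AS available_mixin
    ON passenger_mixin.value = available_mixin.value
ORDER BY passengers.passenger_name, cocoa_cars.car_id
""").fetchall()

# Ada matches car 1 on peppermint and car 2 on both of her mixins, so she
# shows up once per (mixin, car) match; Bob matches nothing.
print(rows)  # [('Ada', 1), ('Ada', 2), ('Ada', 2)]
```

Each matching (mixin, car) pair produces its own row, which is exactly why we see repeated rows per passenger.
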
Nothing is wrong with it as such, but the report looks quite long, especially if a passenger had more than a couple of mixins or there were more cars.&#xA;&#xA;We can group by the `passenger_name` and concatenate the matching `car_id`s.&#xA;&#xA;```sql&#xA;WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT &#xA;     passengers.passenger_name,&#xA;     &#39;[&#39; || GROUP_CONCAT(DISTINCT stocked_cars.car_id) || &#39;]&#39; AS cocoa_cars&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value&#xA;GROUP BY passengers.passenger_name;&#xA;```&#xA;&#xA;Here, we have added&#xA;&#xA;```&#xA; &#39;[&#39; || GROUP_CONCAT(DISTINCT stocked_cars.car_id) || &#39;]&#39; &#xA;```&#xA;&#xA;and grouped the rows with&#xA;&#xA;```&#xA;GROUP BY passengers.passenger_name;&#xA;```&#xA;&#xA;This squishes all the separate rows for a passenger&#39;s favorite mixins and the cars that stock them down into a single row, and we define how the different `car_id`s are combined with [GROUP_CONCAT](https://www.sqlite.org/lang_aggfunc.html#group_concat). This function concatenates (joins together) multiple strings with a specific separator (by default the separator is `,`).&#xA;We also use the `||` concatenation operator to add `[` at the start and `]` at the end of the list of `car_id`s.&#xA;&#xA;&gt; The group_concat() function returns a string which is the concatenation of all non-NULL values of X. If parameter Y is present then it is used as the separator between instances of X. 
A comma (&#34;,&#34;) is used as the separator if Y is omitted.&#xA;&#xA;&#xA;&#xA;```&#xA;sqlite&gt; WITH stocked_cars as (&#xA;    SELECT * FROM cocoa_cars ORDER BY total_stock DESC LIMIT 3&#xA;)&#xA;SELECT passengers.passenger_name, &#39;[&#39; || GROUP_CONCAT(DISTINCT stocked_cars.car_id) || &#39;]&#39; AS cocoa_cars&#xA;FROM passengers&#xA;JOIN json_each(passengers.favorite_mixins) as passenger_mixin&#xA;JOIN stocked_cars&#xA;JOIN json_each(stocked_cars.available_mixins) AS available_mixin&#xA;    ON passenger_mixin.value = available_mixin.value&#xA;   ...&gt; GROUP BY passengers.passenger_name;&#xA;+------------------------+------------+&#xA;|     passenger_name     | cocoa_cars |&#xA;+------------------------+------------+&#xA;| Aaron Ashcroft         | [2]        |&#xA;| Abigail Worthington    | [2,9]      |&#xA;| Aiden Stanfield        | [9]        |&#xA;| Alexander Stratton     | [5,9]      |&#xA;| Alexis Thorndale       | [9]        |&#xA;| Alice Merriweather     | [5,2,9]    |&#xA;| Amanda Fitzroy         | [5,2,9]    |&#xA;| Amelia Rosewood        | [9]        |&#xA;| Anna Westfield         | [2]        |&#xA;| Aria Blackwood         | [5,2,9]    |&#xA;| Ashley Fairhaven       | [5,2,9]    |&#xA;| August Blackwell       | [5,2,9]    |&#xA;| Aurora Whitfield       | [5,2,9]    |&#xA;| Benjamin Fairweather   | [9]        |&#xA;| Benjamin Patel         | [9]        |&#xA;| Bianca Pereira         | [9]        |&#xA;| Blake Dunwood          | [5,9]      |&#xA;| Brandon Locklear       | [5,2,9]    |&#xA;| Caleb Osei             | [5,9]      |&#xA;| Cameron Gladstone      | [2,9]      |&#xA;| Cameron Rogers         | [5,2,9]    |&#xA;| Carlos Mendez          | [5,9]      |&#xA;| Caroline Bannister     | [2,9]      |&#xA;| Carter Brookfield      | [2]        |&#xA;| Charlotte Singh        | [5,2,9]    |&#xA;| Charlotte Waverly      | [5,9]      |&#xA;| Chase Ashbury          | [5,9]      |&#xA;| Chloe Blackstone       | [5,9]      |&#xA;| Claire Berkshire      
 | [5,9]      |&#xA;| Clara Westbrook        | [2]        |&#xA;| Cole Ashland           | [2]        |&#xA;| Connor Redmond         | [2,9]      |&#xA;| Daniel Murphy          | [5,2]      |&#xA;| Declan Summerfield     | [2,9]      |&#xA;| Diana Brightwell       | [5,9]      |&#xA;| Diego Ramos            | [5,9]      |&#xA;| Dylan Blakely          | [5,2,9]    |&#xA;| Eleanor Morrison       | [5,9]      |&#xA;| Elena Morales          | [5,2,9]    |&#xA;| Elijah Wakefield       | [2,9]      |&#xA;| Eliza Radcliffe        | [5,9]      |&#xA;| Elizabeth Chesterfield | [5,2,9]    |&#xA;| Emily Claridge         | [5,9]      |&#xA;| Emily Johnson          | [2]        |&#xA;| Emma Gilmore           | [5,2,9]    |&#xA;| Emma Svensson          | [5,9]      |&#xA;| Ethan Marlowe          | [5,2,9]    |&#xA;| Eva Schmidt            | [5,2]      |&#xA;| Eva Templeton          | [5,2,9]    |&#xA;| Fatima Noor            | [9]        |&#xA;| Felix Schneider        | [9]        |&#xA;| Felix Whitmore         | [2]        |&#xA;| Finn Lockhart          | [2,9]      |&#xA;| Gabriel Winthrop       | [5,2,9]    |&#xA;| Grace Aldridge         | [5,2,9]    |&#xA;| Grace Thornhill        | [5,2,9]    |&#xA;| Grayson Westmore       | [2,9]      |&#xA;| Griffin Hartwell       | [9]        |&#xA;| Hannah Livingston      | [5,2,9]    |&#xA;| Hazel Kincaid          | [2,9]      |&#xA;| Henry Treadwell        | [2,9]      |&#xA;| Hiro Tanaka            | [5,9]      |&#xA;| Hunter Bellingham      | [5,9]      |&#xA;| Ian Landsman           | [2]        |&#xA;| Ingrid Nilsen          | [5,2,9]    |&#xA;| Iris Pembroke          | [5,2,9]    |&#xA;| Isaac Kendrick         | [2]        |&#xA;| Isabella Laurent       | [5,9]      |&#xA;| Isabella Rodriguez     | [9]        |&#xA;| Ivy Winslow            | [2,9]      |&#xA;| Jack Carmichael        | [9]        |&#xA;| Jackson Thorpe         | [2,9]      |&#xA;| James Garrison         | [5,9]      |&#xA;| James Kim              | [5,2,9]    
|&#xA;| James Simon            | [5,2,9]    |&#xA;| Jason Stewart          | [5]        |&#xA;| Jasper Thorne          | [5,2,9]    |&#xA;| Jessica Langston       | [5,2,9]    |&#xA;| Jonah Wolfe            | [2]        |&#xA;| Jordan Waverly         | [5,9]      |&#xA;| Julia Pendleton        | [5,2,9]    |&#xA;| Julian Blackburn       | [5]        |&#xA;| Kayla Mansfield        | [5,9]      |&#xA;| Keiko Ito              | [2]        |&#xA;| Landon Whitworth       | [2,9]      |&#xA;| Laura Thornbury        | [5,2,9]    |&#xA;| Lauren Silverton       | [2,9]      |&#xA;| Layla Brooks           | [2,9]      |&#xA;| Leo Greyson            | [2,9]      |&#xA;| Liam Donovan           | [2,9]      |&#xA;| Liam OConnor           | [2,9]      |&#xA;| Linda Lee              | [5,2]      |&#xA;| Logan Sherwood         | [5,9]      |&#xA;| Lucas Ford             | [5,2]      |&#xA;| Lucas Prescott         | [9]        |&#xA;| Lucy Morris            | [2,9]      |&#xA;| Luna Hartley           | [5]        |&#xA;| Madeline Ramsey        | [2,9]      |&#xA;| Madison Clearwater     | [5,2,9]    |&#xA;| Margaret Fairbanks     | [2,9]      |&#xA;| Mason Fairmont         | [2,9]      |&#xA;| Mateo Cruz             | [5,2,9]    |&#xA;| Matthew Sutherland     | [2,9]      |&#xA;| Megan Redfield         | [5,9]      |&#xA;| Melissa Ravenscroft    | [5]        |&#xA;| Mia Chen               | [5,9]      |&#xA;| Mila Novak             | [2,9]      |&#xA;| Miles Brennan          | [5,2,9]    |&#xA;| Milo Ashford           | [2,9]      |&#xA;| Mira Zhao              | [2,9]      |&#xA;| Morgan Steadman        | [5,9]      |&#xA;| Natalie Warwick        | [2]        |&#xA;| Nathan Whitley         | [5,9]      |&#xA;| Nia Grant              | [5,9]      |&#xA;| Nicholas Montague      | [5]        |&#xA;| Nicole Ashridge        | [9]        |&#xA;| Noah Fischer           | [2,9]      |&#xA;| Nolan Murphy           | [2,9]      |&#xA;| Nolan Young            | [5,2,9]    |&#xA;| Nora 
Calloway          | [9]        |&#xA;| Nova Adams             | [2]        |&#xA;| Oliver Ashby           | [9]        |&#xA;| Olivia Carrington      | [2]        |&#xA;| Olivia Dubois          | [5,9]      |&#xA;| Omar Qureshi           | [2]        |&#xA;| Owen Fairchild         | [5,9]      |&#xA;| Parker Blackmore       | [5,2,9]    |&#xA;| Penelope Sinclair      | [5,2,9]    |&#xA;| Rachel Wyndham         | [5,9]      |&#xA;| Rafael Silva           | [2,9]      |&#xA;| Ravi Patel             | [5,2,9]    |&#xA;| Rebecca Ashcombe       | [2,9]      |&#xA;| Robert Smith           | [9]        |&#xA;| Rose Drummond          | [5,2,9]    |&#xA;| Ruby Hawthorne         | [2]        |&#xA;| Ryan Beckett           | [5,9]      |&#xA;| Samantha Brightwood    | [9]        |&#xA;| Samuel Kingsley        | [5,2,9]    |&#xA;| Sara Johansson         | [2]        |&#xA;| Sarah Davis            | [9]        |&#xA;| Sarah Remington        | [5,2,9]    |&#xA;| Sawyer Hollingsworth   | [9]        |&#xA;| Scarlett Dalton        | [5,2,9]    |&#xA;| Silas Merrick          | [2]        |&#xA;| Sofia Kim              | [2]        |&#xA;| Sophia Brookshire      | [2,9]      |&#xA;| Sophia Rossi           | [5,2,9]    |&#xA;| Sophie Langford        | [5,9]      |&#xA;| Stella Beaumont        | [2,9]      |&#xA;| Tariq Hassan           | [5,9]      |&#xA;| Taylor Woodridge       | [5,2,9]    |&#xA;| Theodore Hadley        | [2]        |&#xA;| Tim Cook               | [9]        |&#xA;| Tristan Ashbrook       | [9]        |&#xA;| Victoria Harwood       | [2,9]      |&#xA;| Violet Sterling        | [2,9]      |&#xA;| William Beauchamp      | [5,2,9]    |&#xA;| William Becker         | [9]        |&#xA;| William Chen           | [5,2,9]    |&#xA;| Wyatt Alderton         | [5]        |&#xA;| Yara Haddad            | [5]        |&#xA;| Youssef El-Sayed       | [2,9]      |&#xA;| Zachary Collins        | [2,9]      |&#xA;| Zachary Stafford       | [5,9]      |&#xA;| Zara Sheikh            | 
[9]        |&#xA;| Zoe Wilson             | [2]        |&#xA;+------------------------+------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;I think we are done.&#xA;&#xA;We did it, and it was a bit of a different one.&#xA;&#xA;Some weird hacks here and there, but we made it!&#xA;&#xA;Day 7 done and dusted.&#xA;&#xA;On to day 8.</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 6: Days of Delight</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-6</link>
      <description>Advent of SQL Day 6: Days of Delight It is day 6 of advent of SQL. Let&#39;s jump straight into the sql for the day. So, we have two tables: the families table and the deliveries_assigned table. The first t</description>
      <pubDate>Sun, 21 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL Day 6: Days of Delight&#xA;&#xA;It is day 6 of advent of SQL.&#xA;&#xA;Let&#39;s jump straight into the sql for the day.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS families;&#xA;DROP TABLE IF EXISTS deliveries_assigned;&#xA;&#xA;CREATE TABLE families (&#xA;    id INT PRIMARY KEY,&#xA;    family_name TEXT&#xA;);&#xA;&#xA;CREATE TABLE deliveries_assigned (&#xA;    id INT PRIMARY KEY,&#xA;    family_id INT,&#xA;    gift_date DATE,&#xA;    gift_name TEXT&#xA;);&#xA;&#xA;INSERT INTO families (id, family_name) VALUES&#xA;    (1, &#39;Isla Martinez&#39;),&#xA;    (2, &#39;Nolan Garcia&#39;),&#xA;    (3, &#39;Yara Chen&#39;),&#xA;    (4, &#39;Tariq Nguyen&#39;),&#xA;    (5, &#39;Mila Hernandez&#39;),&#xA;    (6, &#39;Casey Kim&#39;),&#xA;    (7, &#39;Mateo Hernandez&#39;),&#xA;    (8, &#39;Keiko Petrov&#39;),&#xA;    (9, &#39;Ethan Flores&#39;),&#xA;    (10, &#39;Mateo Nakamura&#39;),&#xA;    (11, &#39;Maya Fernandez&#39;),&#xA;    (12, &#39;Mila Davis&#39;),&#xA;    (13, &#39;Yara Rossi&#39;),&#xA;    (14, &#39;Nolan Phillips&#39;),&#xA;    (15, &#39;Amina Perez&#39;);&#xA;&#xA;INSERT INTO deliveries_assigned (id, family_id, gift_date, gift_name) VALUES&#xA;    (1, 1, &#39;2025-12-01&#39;, &#39;roasted cashews&#39;),&#xA;    (2, 1, &#39;2025-12-02&#39;, &#39;cookie decorating kit&#39;),&#xA;    (3, 1, &#39;2025-12-03&#39;, &#39;dark chocolate assortment&#39;),&#xA;    (4, 1, &#39;2025-12-04&#39;, &#39;white chocolate candies&#39;),&#xA;    (5, 1, &#39;2025-12-05&#39;, &#39;reindeer headband&#39;),&#xA;    (6, 1, &#39;2025-12-06&#39;, &#39;holiday brownie bites&#39;),&#xA;    (7, 1, &#39;2025-12-07&#39;, &#39;shortbread cookie tin&#39;),&#xA;    (8, 1, &#39;2025-12-08&#39;, &#39;chocolate chip cookies&#39;),&#xA;    (9, 1, &#39;2025-12-11&#39;, &#39;holiday jam trio&#39;),&#xA;    (10, 1, &#39;2025-12-12&#39;, &#39;white chocolate popcorn&#39;),&#xA;    (11, 1, &#39;2025-12-14&#39;, &#39;holiday jam trio&#39;),&#xA;    (12, 1, 
&#39;2025-12-15&#39;, &#39;fudge bites&#39;),&#xA;    (13, 1, &#39;2025-12-16&#39;, &#39;holiday sticker sheet&#39;),&#xA;    (14, 1, &#39;2025-12-18&#39;, &#39;hot cocoa bombs&#39;),&#xA;    (15, 1, &#39;2025-12-19&#39;, &#39;honey roasted nuts&#39;),&#xA;    (16, 1, &#39;2025-12-20&#39;, &#39;holiday mug&#39;),&#xA;    (17, 1, &#39;2025-12-21&#39;, &#39;white chocolate candies&#39;),&#xA;    (18, 1, &#39;2025-12-22&#39;, &#39;puzzle book&#39;),&#xA;    (19, 1, &#39;2025-12-23&#39;, &#39;snowman plush&#39;),&#xA;    (20, 1, &#39;2025-12-24&#39;, &#39;scented hand cream&#39;),&#xA;    (21, 1, &#39;2025-12-25&#39;, &#39;vanilla bean wafers&#39;),&#xA;    (22, 2, &#39;2025-12-01&#39;, &#39;roasted cashews&#39;),&#xA;    (23, 2, &#39;2025-12-02&#39;, &#39;holiday brownie bites&#39;),&#xA;    (24, 2, &#39;2025-12-03&#39;, &#39;peppermint bark bites&#39;),&#xA;    (25, 2, &#39;2025-12-04&#39;, &#39;holiday jam trio&#39;),&#xA;    (26, 2, &#39;2025-12-05&#39;, &#39;festive notepad&#39;),&#xA;    (27, 2, &#39;2025-12-06&#39;, &#39;scented pine sachet&#39;),&#xA;    (28, 2, &#39;2025-12-07&#39;, &#39;holiday mug&#39;),&#xA;    (29, 2, &#39;2025-12-08&#39;, &#39;shortbread cookie tin&#39;),&#xA;    (30, 2, &#39;2025-12-09&#39;, &#39;dark chocolate assortment&#39;);&#xA;&#xA;```&#xA;&#xA;So, we have two tables:&#xA;1. `families` table&#xA;2. 
`deliveries_assigned` table&#xA;&#xA;The first table, `families`, just has the id and the name of the family.&#xA;&#xA;The second table, `deliveries_assigned` has the id, family id, gift date, and gift name.&#xA;&#xA;Let&#39;s look at the problem statement.&#xA;&#xA;## Problem&#xA;&#xA;&gt; Generate a report that returns the dates and families that have no delivery assigned after December 14th, using the `families` and `deliveries_assigned`.&#xA;&gt; &#xA;&gt; Each row in the report should be a date and family name that represents the dates in which families don&#39;t have a delivery assigned yet.&#xA;&gt; &#xA;&gt; Label the columns as `unassigned_date` and `name`. Order the results by `unassigned_date` and `name`, respectively, both in ascending order.&#xA;&#xA;&#xA;Ok, so we need to list the deliveries_assigned for each family first, to see what kind of pattern we are looking for.&#xA;&#xA;```sql&#xA;SELECT * FROM deliveries_assigned WHERE family_id = 1;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; select * from deliveries_assigned where family_id=1;&#xA;+----+-----------+------------+---------------------------+&#xA;| id | family_id | gift_date  |         gift_name         |&#xA;+----+-----------+------------+---------------------------+&#xA;| 1  | 1         | 2025-12-01 | roasted cashews           |&#xA;| 2  | 1         | 2025-12-02 | cookie decorating kit     |&#xA;| 3  | 1         | 2025-12-03 | dark chocolate assortment |&#xA;| 4  | 1         | 2025-12-04 | white chocolate candies   |&#xA;| 5  | 1         | 2025-12-05 | reindeer headband         |&#xA;| 6  | 1         | 2025-12-06 | holiday brownie bites     |&#xA;| 7  | 1         | 2025-12-07 | shortbread cookie tin     |&#xA;| 8  | 1         | 2025-12-08 | chocolate chip cookies    |&#xA;| 9  | 1         | 2025-12-11 | holiday jam trio          |&#xA;| 10 | 1         | 2025-12-12 | white chocolate popcorn   |&#xA;| 11 | 1         | 2025-12-14 | holiday jam trio          |&#xA;| 12 | 1         | 2025-12-15 | fudge 
bites               |&#xA;| 13 | 1         | 2025-12-16 | holiday sticker sheet     |&#xA;| 14 | 1         | 2025-12-18 | hot cocoa bombs           |&#xA;| 15 | 1         | 2025-12-19 | honey roasted nuts        |&#xA;| 16 | 1         | 2025-12-20 | holiday mug               |&#xA;| 17 | 1         | 2025-12-21 | white chocolate candies   |&#xA;| 18 | 1         | 2025-12-22 | puzzle book               |&#xA;| 19 | 1         | 2025-12-23 | snowman plush             |&#xA;| 20 | 1         | 2025-12-24 | scented hand cream        |&#xA;| 21 | 1         | 2025-12-25 | vanilla bean wafers       |&#xA;+----+-----------+------------+---------------------------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;So, we are missing gifts for family id `1` on `09`, `10`, `13`, and `17`. But we are only asked for gifts after `14` (December 14th).&#xA;&#xA;&gt; Generate a report that returns the dates and families that have **no delivery assigned after December 14th**, using the families and deliveries_assigned.&#xA;&#xA;So, we can discard `09`, `10`, and `13` as they are before December 14th.&#xA;&#xA;```sql&#xA;SELECT * FROM deliveries_assigned WHERE family_id = 1 AND gift_date &gt; &#39;2025-12-14&#39;;&#xA;```&#xA;&#xA;It gives the right dates where the gifts are assigned after December 14th. But the problem is we need to get the dates which are missing from the `deliveries_assigned` records for each family.&#xA;&#xA;Finding something missing is kind of weird, because you don&#39;t have what is missing. 
Especially for dates; dates are particularly painful.&#xA;&#xA;We need to find, among the sequential order of the dates, which dates are missing. That is simple here, but you can see it could get quite cumbersome if we had to manually add each date to the list we compare against.&#xA;&#xA;### JOINs with NOT EXISTS&#xA;&#xA;So, the basic dirty solution is to check the missing dates for each family, one by one.&#xA;&#xA;```sql&#xA;SELECT &#xA;    families.family_name AS name,&#xA;    dates.column1 AS unassigned_date&#xA;FROM families&#xA;JOIN (&#xA;    VALUES &#xA;        (&#39;2025-12-15&#39;), (&#39;2025-12-16&#39;), (&#39;2025-12-17&#39;),&#xA;        (&#39;2025-12-18&#39;), (&#39;2025-12-19&#39;), (&#39;2025-12-20&#39;),&#xA;        (&#39;2025-12-21&#39;), (&#39;2025-12-22&#39;), (&#39;2025-12-23&#39;),&#xA;        (&#39;2025-12-24&#39;), (&#39;2025-12-25&#39;)&#xA;) AS dates ON 1=1&#xA;WHERE NOT EXISTS (&#xA;    SELECT 1 &#xA;    FROM deliveries_assigned&#xA;    WHERE deliveries_assigned.family_id = families.id &#xA;    AND deliveries_assigned.gift_date = dates.column1&#xA;)&#xA;ORDER BY unassigned_date, name;&#xA;```&#xA;&#xA;Let&#39;s break it down:&#xA;1. We have a `families` table with id and family name.&#xA;2. We have a `deliveries_assigned` table with id, family id, gift date, and gift name.&#xA;3. We create a list of dates from `2025-12-15` to `2025-12-25` using the `VALUES` keyword.&#xA;   - This just appends one date after another and names the column `column1`, with the derived table aliased as `dates`.&#xA;   - The `ON` condition is `1=1`, which is always true, so every family is paired with every date.&#xA;4. We use the `NOT EXISTS` keyword to check whether the `deliveries_assigned` table has a record for each date in the list.&#xA;   - We use `NOT EXISTS` because we want to keep only the (family, date) pairs that have no matching record in the assigned list.&#xA;5. 
We order the results by `unassigned_date` and `name`, respectively, both in ascending order.&#xA;&#xA;So, that is not the best way to solve this, I think.&#xA;&#xA;### Recursive CTEs&#xA;&#xA;We can generate a table full of dates, and then cross join it with the `families` table. This will give us all possible combinations of dates and families, as if a gift should ideally exist for each family from the 1st to the 25th of December (however, we are only interested in the 15th to the 25th). Then, once we have that full combination table, we can LEFT JOIN the `deliveries_assigned` table for each family and keep the rows whose joined family_id is `NULL`, because those are the dates that are missing for that family.&#xA;&#xA;First we&#39;ll create a recursive table of dates from `2025-12-15` to `2025-12-25`.&#xA;&#xA;```sql&#xA;WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;)&#xA;SELECT * FROM dates;&#xA;```&#xA;&#xA;This will give us the dates from `2025-12-15` to `2025-12-25` in a recursive table.&#xA;&#xA;What is a recursive table?&#xA;&#xA;&gt; A recursive table is a table that is defined in terms of itself.&#xA;&#xA;So, we create a recursive table of dates from `2025-12-15` to `2025-12-25`. The base case is `2025-12-15`, and the recursive case is `SELECT date(gift_date, &#39;+1 day&#39;) FROM dates WHERE gift_date &lt; &#39;2025-12-25&#39;`, which produces `2025-12-16` on the first recursive step because of the `+1 day` interval in the `date` function.&#xA;&#xA;The date function is a [function](https://sqlite.org/lang_datefunc.html#modifiers) which takes in a date, and we can add modifiers to it to manipulate or extract parts of the date. Here we have added the `+1 day` modifier, which will increment the day by one. 
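
To see the recursion and the `+1 day` modifier in action outside the sqlite shell, here is a small sketch using the Python `sqlite3` module. The stop condition is written with `!=` rather than a less-than comparison; it behaves the same here because stepping one day at a time lands exactly on `2025-12-25`:

```python
import sqlite3

# Generate the dates 2025-12-15 .. 2025-12-25 with a recursive CTE,
# stepping with the date() function's '+1 day' modifier.
conn = sqlite3.connect(":memory:")
dates = [row[0] for row in conn.execute("""
WITH RECURSIVE dates(gift_date) AS (
    SELECT '2025-12-15'
    UNION ALL
    SELECT date(gift_date, '+1 day')
    FROM dates
    WHERE gift_date != '2025-12-25'   -- stop once we reach the end date
)
SELECT gift_date FROM dates
""")]

print(len(dates), dates[0], dates[-1])  # 11 2025-12-15 2025-12-25
```
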
We then call that in the `dates` CTE again till `gift_date` is no longer less than `2025-12-25`. By then we will have created all the dates from `15` through `25`, including `25`. We could just type in the values manually as we did in the first dirty solution, but I wanted to see how we can generate dates dynamically in SQLite.&#xA;&#xA;```&#xA;sqlite&gt; WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;)&#xA;SELECT * FROM dates;&#xA;+------------+&#xA;| gift_date  |&#xA;+------------+&#xA;| 2025-12-15 |&#xA;| 2025-12-16 |&#xA;| 2025-12-17 |&#xA;| 2025-12-18 |&#xA;| 2025-12-19 |&#xA;| 2025-12-20 |&#xA;| 2025-12-21 |&#xA;| 2025-12-22 |&#xA;| 2025-12-23 |&#xA;| 2025-12-24 |&#xA;| 2025-12-25 |&#xA;+------------+&#xA;Run Time: real 0.000 user 0.000225 sys 0.000009&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Now we&#39;ll cross join the `families` table with the `dates` table.&#xA;&#xA;```sql&#xA;SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;FROM families&#xA;CROSS JOIN dates;&#xA;```&#xA;&#xA;OK! 
Wait, you need to include the `dates` CTE from above too; I was just simplifying the query.&#xA;&#xA;```sql&#xA;WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;)&#xA;SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;FROM families&#xA;CROSS JOIN dates;&#xA;```&#xA;This will give us the full table of combinations of dates and families.&#xA;&#xA;```sql&#xA;SELECT COUNT(*) FROM families;&#xA;```&#xA;&#xA;```sql&#xA;sqlite&gt; SELECT COUNT(*) FROM families;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 250      |&#xA;+----------+&#xA;Run Time: real 0.000 user 0.000113 sys 0.000006&#xA;sqlite&gt;&#xA;```&#xA;&#xA;That is:&#xA;&#xA;- There are 11 dates: `2025-12-15` to `2025-12-25` -&gt; `15` (1), `16` (2), `17` (3), `18` (4), `19` (5), `20` (6), `21` (7), `22` (8), `23` (9), `24` (10), `25` (11)&#xA;- There are `250` families.&#xA;- So, a `CROSS JOIN` will give us `11 * 250 = 2750` rows.&#xA;&#xA;&#xA;```sql&#xA;SELECT COUNT(*) FROM (WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;)&#xA;SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;FROM families&#xA;CROSS JOIN dates) as count;&#xA;```&#xA;&#xA;11 dates for each family.&#xA;&#xA;```&#xA;sqlite&gt; SELECT COUNT(*) FROM (WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;)&#xA;SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;FROM families&#xA;CROSS JOIN dates) as 
count;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 2750     |&#xA;+----------+&#xA;Run Time: real 0.001 user 0.000407 sys 0.000000&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Now, we simply have to join the `deliveries_assigned` table with the above table.&#xA;&#xA;Why?&#xA;Because we need to map which dates are assigned and which dates are missing.&#xA;&#xA;Which join do we need?&#xA;&#xA;LEFT, RIGHT, or INNER?&#xA;&#xA;LEFT.&#xA;&#xA;Why?&#xA;&#xA;Because the left table will have all the dates, and the right table will have only the assigned dates. **We need all the records from the `left` table (the combination, cross join table)** in order to map which dates are assigned and which dates are missing.&#xA;&#xA;If the `right` table has a matching record, the gift is assigned for that date. If there is no match, the columns from the `right` table come back as `NULL`, which means the date is unassigned.&#xA;&#xA;Hence, we can then simply keep only the rows where the `family_id` from the `deliveries_assigned` side is `NULL`.&#xA;&#xA;```sql&#xA;WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;),&#xA;combination AS (&#xA;    SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;    FROM families&#xA;    CROSS JOIN dates&#xA;)&#xA;SELECT&#xA;    *&#xA;FROM combination&#xA;LEFT JOIN deliveries_assigned&#xA;    ON deliveries_assigned.family_id = combination.family_id&#xA;    AND deliveries_assigned.gift_date = combination.gift_date&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; 
&#39;2025-12-25&#39;&#xA;),&#xA;combination AS (&#xA;    SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;    FROM families&#xA;    CROSS JOIN dates&#xA;)&#xA;SELECT&#xA;    *&#xA;FROM combination&#xA;LEFT JOIN deliveries_assigned&#xA;    ON deliveries_assigned.family_id = combination.family_id&#xA;    AND deliveries_assigned.gift_date = combination.gift_date LIMIT 30;&#xA;+-----------+---------------+------------+----+-----------+------------+-------------------------+&#xA;| family_id |  family_name  | gift_date  | id | family_id | gift_date  |        gift_name        |&#xA;+-----------+---------------+------------+----+-----------+------------+-------------------------+&#xA;| 1         | Isla Martinez | 2025-12-15 | 12 | 1         | 2025-12-15 | fudge bites             |&#xA;| 1         | Isla Martinez | 2025-12-16 | 13 | 1         | 2025-12-16 | holiday sticker sheet   |&#xA;| 1         | Isla Martinez | 2025-12-17 |    |           |            |                         |&#xA;| 1         | Isla Martinez | 2025-12-18 | 14 | 1         | 2025-12-18 | hot cocoa bombs         |&#xA;| 1         | Isla Martinez | 2025-12-19 | 15 | 1         | 2025-12-19 | honey roasted nuts      |&#xA;| 1         | Isla Martinez | 2025-12-20 | 16 | 1         | 2025-12-20 | holiday mug             |&#xA;| 1         | Isla Martinez | 2025-12-21 | 17 | 1         | 2025-12-21 | white chocolate candies |&#xA;| 1         | Isla Martinez | 2025-12-22 | 18 | 1         | 2025-12-22 | puzzle book             |&#xA;| 1         | Isla Martinez | 2025-12-23 | 19 | 1         | 2025-12-23 | snowman plush           |&#xA;| 1         | Isla Martinez | 2025-12-24 | 20 | 1         | 2025-12-24 | scented hand cream      |&#xA;| 1         | Isla Martinez | 2025-12-25 | 21 | 1         | 2025-12-25 | vanilla bean wafers     |&#xA;| 2         | Nolan Garcia  | 2025-12-15 | 36 | 2         | 2025-12-15 | mini marshmallow tubes  |&#xA;| 2         | Nolan 
Garcia  | 2025-12-16 | 37 | 2         | 2025-12-16 | white chocolate candies |&#xA;| 2         | Nolan Garcia  | 2025-12-17 | 38 | 2         | 2025-12-17 | gingerbread cookie kit  |&#xA;| 2         | Nolan Garcia  | 2025-12-18 | 39 | 2         | 2025-12-18 | family card game        |&#xA;| 2         | Nolan Garcia  | 2025-12-19 |    |           |            |                         |&#xA;| 2         | Nolan Garcia  | 2025-12-20 | 40 | 2         | 2025-12-20 | santa hat               |&#xA;| 2         | Nolan Garcia  | 2025-12-21 | 41 | 2         | 2025-12-21 | holiday sticker sheet   |&#xA;| 2         | Nolan Garcia  | 2025-12-22 |    |           |            |                         |&#xA;| 2         | Nolan Garcia  | 2025-12-23 | 42 | 2         | 2025-12-23 | pecan praline bites     |&#xA;| 2         | Nolan Garcia  | 2025-12-24 |    |           |            |                         |&#xA;| 2         | Nolan Garcia  | 2025-12-25 | 43 | 2         | 2025-12-25 | santa hat               |&#xA;| 3         | Yara Chen     | 2025-12-15 | 57 | 3         | 2025-12-15 | peppermint bark bites   |&#xA;| 3         | Yara Chen     | 2025-12-16 |    |           |            |                         |&#xA;| 3         | Yara Chen     | 2025-12-17 |    |           |            |                         |&#xA;| 3         | Yara Chen     | 2025-12-18 | 58 | 3         | 2025-12-18 | cheddar popcorn         |&#xA;| 3         | Yara Chen     | 2025-12-19 |    |           |            |                         |&#xA;| 3         | Yara Chen     | 2025-12-20 | 59 | 3         | 2025-12-20 | festive notepad         |&#xA;| 3         | Yara Chen     | 2025-12-21 | 60 | 3         | 2025-12-21 | fruit assortment        |&#xA;| 3         | Yara Chen     | 2025-12-22 |    |           |            |                         |&#xA;+-----------+---------------+------------+----+-----------+------------+-------------------------+&#xA;...&#xA;...&#xA;| 249       | Jude Bautista     | 2025-12-24 | 
     |           |            |                            |&#xA;| 249       | Jude Bautista     | 2025-12-25 | 5073 | 249       | 2025-12-25 | almond brittle             |&#xA;| 250       | Bianca Muller     | 2025-12-15 | 5086 | 250       | 2025-12-15 | cocoa mix bundle           |&#xA;| 250       | Bianca Muller     | 2025-12-16 | 5087 | 250       | 2025-12-16 | cookie decorating kit      |&#xA;| 250       | Bianca Muller     | 2025-12-17 | 5088 | 250       | 2025-12-17 | shortbread cookie tin      |&#xA;| 250       | Bianca Muller     | 2025-12-18 |      |           |            |                            |&#xA;| 250       | Bianca Muller     | 2025-12-19 | 5089 | 250       | 2025-12-19 | snowflake candle           |&#xA;| 250       | Bianca Muller     | 2025-12-20 |      |           |            |                            |&#xA;| 250       | Bianca Muller     | 2025-12-21 | 5090 | 250       | 2025-12-21 | trail mix trio             |&#xA;| 250       | Bianca Muller     | 2025-12-22 | 5091 | 250       | 2025-12-22 | shortbread cookie tin      |&#xA;| 250       | Bianca Muller     | 2025-12-23 |      |           |            |                            |&#xA;| 250       | Bianca Muller     | 2025-12-24 |      |           |            |                            |&#xA;| 250       | Bianca Muller     | 2025-12-25 | 5092 | 250       | 2025-12-25 | gingerbread cookie kit     |&#xA;+-----------+-------------------+------------+------+-----------+------------+----------------------------+&#xA;Run Time: real 0.028 user 0.013392 sys 0.013096&#xA;sqlite&gt;&#xA;```&#xA;&#xA;This will give all the gifts that have been assigned, as well as the slots left unassigned, for each family on each date from `2025-12-15` to `2025-12-25`.&#xA;&#xA;Now you can see what we need and what we don&#39;t: we simply keep the rows where the `family_id` from the `deliveries_assigned` table is `NULL`, since there was no matching record in `deliveries_assigned` for that 
date.&#xA;&#xA;&#xA;```sql&#xA;WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;),&#xA;combination AS (&#xA;    SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;    FROM families&#xA;    CROSS JOIN dates&#xA;)&#xA;SELECT&#xA;    *&#xA;FROM combination&#xA;LEFT JOIN deliveries_assigned&#xA;    ON deliveries_assigned.family_id = combination.family_id&#xA;    AND deliveries_assigned.gift_date = combination.gift_date&#xA;WHERE deliveries_assigned.family_id IS NULL&#xA;```&#xA;So, a simple `WHERE` clause with deliveries_assigned.family_id `IS NULL` will give us the missing dates for each family.&#xA;&#xA;```&#xA;sqlite&gt; WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;),&#xA;combination AS (&#xA;    SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;    FROM families&#xA;    CROSS JOIN dates&#xA;)&#xA;SELECT&#xA;    *&#xA;FROM combination&#xA;LEFT JOIN deliveries_assigned&#xA;    ON deliveries_assigned.family_id = combination.family_id&#xA;    AND deliveries_assigned.gift_date = combination.gift_date&#xA;WHERE deliveries_assigned.family_id IS NULL&#xA;   ...&gt; ;&#xA;+-----------+-------------------+------------+----+-----------+-----------+-----------+&#xA;| family_id |    family_name    | gift_date  | id | family_id | gift_date | gift_name |&#xA;+-----------+-------------------+------------+----+-----------+-----------+-----------+&#xA;| 1         | Isla Martinez     | 2025-12-17 |    |           |           |           |&#xA;| 2         | Nolan Garcia      | 2025-12-19 |    |           |           |           |&#xA;| 2         | Nolan 
Garcia      | 2025-12-22 |    |           |           |           |&#xA;| 2         | Nolan Garcia      | 2025-12-24 |    |           |           |           |&#xA;| 3         | Yara Chen         | 2025-12-16 |    |           |           |           |&#xA;| 3         | Yara Chen         | 2025-12-17 |    |           |           |           |&#xA;| 3         | Yara Chen         | 2025-12-19 |    |           |           |           |&#xA;| 3         | Yara Chen         | 2025-12-22 |    |           |           |           |&#xA;| 4         | Tariq Nguyen      | 2025-12-16 |    |           |           |           |&#xA;...&#xA;...&#xA;| 247       | Malik Kim         | 2025-12-18 |    |           |           |           |&#xA;| 247       | Malik Kim         | 2025-12-20 |    |           |           |           |&#xA;| 247       | Malik Kim         | 2025-12-21 |    |           |           |           |&#xA;| 248       | Tariq Flores      | 2025-12-16 |    |           |           |           |&#xA;| 248       | Tariq Flores      | 2025-12-17 |    |           |           |           |&#xA;| 248       | Tariq Flores      | 2025-12-18 |    |           |           |           |&#xA;| 249       | Jude Bautista     | 2025-12-16 |    |           |           |           |&#xA;| 249       | Jude Bautista     | 2025-12-20 |    |           |           |           |&#xA;| 249       | Jude Bautista     | 2025-12-21 |    |           |           |           |&#xA;| 249       | Jude Bautista     | 2025-12-23 |    |           |           |           |&#xA;| 249       | Jude Bautista     | 2025-12-24 |    |           |           |           |&#xA;| 250       | Bianca Muller     | 2025-12-18 |    |           |           |           |&#xA;| 250       | Bianca Muller     | 2025-12-20 |    |           |           |           |&#xA;| 250       | Bianca Muller     | 2025-12-23 |    |           |           |           |&#xA;| 250       | Bianca Muller     | 2025-12-24 |    |           |         
  |           |&#xA;+-----------+-------------------+------------+----+-----------+-----------+-----------+&#xA;Run Time: real 0.014 user 0.011508 sys 0.002046&#xA;sqlite&gt;&#xA;```&#xA;&#xA;All right, we now need to order by `unassigned_date` and `name` which are the `dates` from the `combination` table and the `family_name` from the `families` table.&#xA;&#xA;```sql&#xA;WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;),&#xA;combination AS (&#xA;    SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;    FROM families&#xA;    CROSS JOIN dates&#xA;)&#xA;SELECT&#xA;    combination.gift_date as unassigned_date,&#xA;    family_name&#xA;FROM combination&#xA;LEFT JOIN deliveries_assigned&#xA;    ON deliveries_assigned.family_id = combination.family_id&#xA;    AND deliveries_assigned.gift_date = combination.gift_date&#xA;WHERE deliveries_assigned.family_id IS NULL&#xA;ORDER BY unassigned_date, family_name;&#xA;```&#xA;&#xA;```&#xA;+-----------------+-------------------+&#xA;| unassigned_date |    family_name    |&#xA;+-----------------+-------------------+&#xA;| 2025-12-15      | Adil Rossi        |&#xA;| 2025-12-15      | Aisha Connor      |&#xA;| 2025-12-15      | Amina Perez       |&#xA;| 2025-12-15      | Amina Wong        |&#xA;| 2025-12-15      | Andre Flores      |&#xA;| 2025-12-15      | Anya Singh        |&#xA;| 2025-12-15      | Arjun Wong        |&#xA;| 2025-12-15      | Bianca Connor     |&#xA;| 2025-12-15      | Caleb Petrov      |&#xA;| 2025-12-15      | Caleb Roberts     |&#xA;| 2025-12-15      | Carmen Carter     |&#xA;| 2025-12-15      | Carmen Garcia     |&#xA;| 2025-12-15      | Casey Flores      |&#xA;| 2025-12-15      | Chi Hughes        |&#xA;| 2025-12-15      | Clara Johnson     |&#xA;| 2025-12-15      | Dara Bautista     |&#xA;| 
2025-12-15      | David Ramirez     |&#xA;| 2025-12-15      | Elias Petrov      |&#xA;| 2025-12-15      | Elias Petrov      |&#xA;| 2025-12-15      | Ethan Flores      |&#xA;| 2025-12-15      | Eva Gonzalez      |&#xA;...&#xA;...&#xA;| 2025-12-25      | Owen Park         |&#xA;| 2025-12-25      | Priya Khan        |&#xA;| 2025-12-25      | Rafael Singh      |&#xA;| 2025-12-25      | Ravi Abdallah     |&#xA;| 2025-12-25      | Ravi Mitchell     |&#xA;| 2025-12-25      | Rosa Turner       |&#xA;| 2025-12-25      | Sara Jensen       |&#xA;| 2025-12-25      | Sara Lopez        |&#xA;| 2025-12-25      | Sara Rossi        |&#xA;| 2025-12-25      | Sarah Phillips    |&#xA;| 2025-12-25      | Seth Garcia       |&#xA;| 2025-12-25      | Sienna Lopez      |&#xA;| 2025-12-25      | Sofia Nakamura    |&#xA;| 2025-12-25      | Tariq Nguyen      |&#xA;| 2025-12-25      | Uma Ali           |&#xA;| 2025-12-25      | Uma Phillips      |&#xA;| 2025-12-25      | Yara Chen         |&#xA;| 2025-12-25      | Yara Rossi        |&#xA;| 2025-12-25      | Yusuf Ali         |&#xA;| 2025-12-25      | Yusuf Hansen      |&#xA;| 2025-12-25      | Yusuf Perez       |&#xA;| 2025-12-25      | Yusuf Rossi       |&#xA;| 2025-12-25      | Zara Khan         |&#xA;+-----------------+-------------------+&#xA;Run Time: real 0.008 user 0.004613 sys 0.002845&#xA;sqlite&gt;&#xA;```&#xA;&#xA;Phew!&#xA;OK, that looks like a mammoth query.&#xA;&#xA;```sql&#xA;WITH RECURSIVE dates(gift_date) AS (&#xA;    SELECT &#39;2025-12-15&#39;&#xA;    UNION ALL&#xA;    SELECT date(gift_date, &#39;+1 day&#39;)&#xA;    FROM dates&#xA;    WHERE gift_date &lt; &#39;2025-12-25&#39;&#xA;),&#xA;combination AS (&#xA;    SELECT families.id AS family_id, families.family_name as family_name, dates.gift_date as gift_date&#xA;    FROM families&#xA;    CROSS JOIN dates&#xA;)&#xA;SELECT&#xA;    combination.gift_date as unassigned_date,&#xA;    family_name&#xA;FROM combination&#xA;LEFT JOIN deliveries_assigned&#xA;    ON 
deliveries_assigned.family_id = combination.family_id&#xA;    AND deliveries_assigned.gift_date = combination.gift_date&#xA;WHERE deliveries_assigned.family_id IS NULL&#xA;ORDER BY unassigned_date, family_name;&#xA;```&#xA;&#xA;So, to recap:&#xA;&#xA;- Generate a table of dates from `2025-12-15` to `2025-12-25` using a recursive CTE.&#xA;- Create a table of combinations of `families` and `dates` using a cross join.&#xA;- Left join the `deliveries_assigned` table with the combination table.&#xA;- Keep only the rows where the `family_id` from the `deliveries_assigned` table is `NULL`.&#xA;- Order the results by `unassigned_date` and `family_name`, both in ascending order.&#xA;&#xA;Simple, right?&#xA;&#xA;That&#39;s it from day 6.&#xA;&#xA;It&#39;s getting serious out there!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 5: EchoTrack Wrapped</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-5</link>
      <description>Advent of SQL Day 5 - EchoTrack Wrapped It is day 5 of advent of SQL. Let&#39;s get rollin. It looks like a good problem. I am excited! Here&#39;s the SQL to get starte</description>
      <pubDate>Sat, 20 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL Day 5 - EchoTrack Wrapped&#xA;&#xA;It is day 5 of advent of SQL.&#xA;&#xA;Let&#39;s get rollin. It looks like a good problem. I am excited!&#xA;&#xA;Here&#39;s the SQL to get started.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS listening_logs;&#xA;&#xA;CREATE TABLE listening_logs (&#xA;    id INTEGER PRIMARY KEY,&#xA;    user_name TEXT,&#xA;    artist TEXT,&#xA;    played_at TIMESTAMP,&#xA;    content_type TEXT&#xA;);&#xA;&#xA;INSERT INTO listening_logs (id, user_name, artist, played_at, content_type) VALUES&#xA;    (1, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-04-08 00:21:53&#39;, &#39;song&#39;),&#xA;    (2, &#39;Zoe Garcia&#39;, &#39;Huberman Lab&#39;, &#39;2025-11-10 19:18:47&#39;, &#39;podcast&#39;),&#xA;    (3, &#39;Zoe Garcia&#39;, &#39;Huberman Lab&#39;, &#39;2025-01-20 15:31:02&#39;, &#39;podcast&#39;),&#xA;    (4, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-01-06 17:33:11&#39;, &#39;song&#39;),&#xA;    (5, &#39;Zoe Garcia&#39;, &#39;Candace&#39;, &#39;2025-03-06 14:07:54&#39;, &#39;podcast&#39;),&#xA;    (6, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-06-05 17:57:59&#39;, &#39;song&#39;),&#xA;    (7, &#39;Zoe Garcia&#39;, &#39;Huberman Lab&#39;, &#39;2025-01-01 20:05:22&#39;, &#39;podcast&#39;),&#xA;    (8, &#39;Zoe Garcia&#39;, &#39;Huberman Lab&#39;, &#39;2025-11-01 12:04:03&#39;, &#39;podcast&#39;),&#xA;    (9, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-09-28 12:42:12&#39;, &#39;song&#39;),&#xA;    (10, &#39;Zoe Garcia&#39;, &#39;The Ben Shapiro Show&#39;, &#39;2025-09-15 01:05:15&#39;, &#39;podcast&#39;),&#xA;    (11, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-04-26 05:31:02&#39;, &#39;song&#39;),&#xA;    (12, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-10-13 17:34:03&#39;, &#39;song&#39;),&#xA;    (13, &#39;Zoe Garcia&#39;, &#39;Mariah Carey&#39;, &#39;2025-01-20 11:21:37&#39;, &#39;song&#39;),&#xA;    (14, &#39;Zoe Garcia&#39;, &#39;Arijit 
Singh&#39;, &#39;2025-11-28 03:55:31&#39;, &#39;song&#39;),&#xA;    (15, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-07-17 05:18:16&#39;, &#39;song&#39;),&#xA;    (16, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-08-20 02:07:45&#39;, &#39;song&#39;),&#xA;    (17, &#39;Zoe Garcia&#39;, &#39;Kendrick Lamar&#39;, &#39;2025-02-16 13:25:27&#39;, &#39;song&#39;),&#xA;    (18, &#39;Zoe Garcia&#39;, &#39;Huberman Lab&#39;, &#39;2025-08-13 19:55:00&#39;, &#39;podcast&#39;),&#xA;    (19, &#39;Zoe Garcia&#39;, &#39;Bruno Mars&#39;, &#39;2025-09-13 07:09:43&#39;, &#39;song&#39;),&#xA;    (20, &#39;Zoe Garcia&#39;, &#39;Arijit Singh&#39;, &#39;2025-04-12 06:30:44&#39;, &#39;song&#39;);&#xA;```&#xA;&#xA;Let&#39;s open a SQLite shell and get started.&#xA;&#xA;```&#xA;$ sqlite3&#xA;SQLite version 3.50.4 2025-07-30 19:33:53&#xA;Enter &#34;.help&#34; for usage hints.&#xA;Connected to a transient in-memory database.&#xA;Use &#34;.open FILENAME&#34; to reopen on a persistent database.&#xA;sqlite&gt; .read day5-inserts.sql&#xA;sqlite&gt; .schema&#xA;CREATE TABLE listening_logs (&#xA;    id INTEGER PRIMARY KEY,&#xA;    user_name TEXT,&#xA;    artist TEXT,&#xA;    played_at TIMESTAMP,&#xA;    content_type TEXT&#xA;);&#xA;sqlite&gt; .mode table&#xA;sqlite&gt; SELECT * FROM listening_logs LIMIT 20;&#xA;+----+------------+----------------------+---------------------+--------------+&#xA;| id | user_name  |        artist        |      played_at      | content_type |&#xA;+----+------------+----------------------+---------------------+--------------+&#xA;| 1  | Zoe Garcia | Arijit Singh         | 2025-04-08 00:21:53 | song         |&#xA;| 2  | Zoe Garcia | Huberman Lab         | 2025-11-10 19:18:47 | podcast      |&#xA;| 3  | Zoe Garcia | Huberman Lab         | 2025-01-20 15:31:02 | podcast      |&#xA;| 4  | Zoe Garcia | Arijit Singh         | 2025-01-06 17:33:11 | song         |&#xA;| 5  | Zoe Garcia | Candace              | 2025-03-06 14:07:54 | podcast      |&#xA;| 6  
| Zoe Garcia | Arijit Singh         | 2025-06-05 17:57:59 | song         |&#xA;| 7  | Zoe Garcia | Huberman Lab         | 2025-01-01 20:05:22 | podcast      |&#xA;| 8  | Zoe Garcia | Huberman Lab         | 2025-11-01 12:04:03 | podcast      |&#xA;| 9  | Zoe Garcia | Arijit Singh         | 2025-09-28 12:42:12 | song         |&#xA;| 10 | Zoe Garcia | The Ben Shapiro Show | 2025-09-15 01:05:15 | podcast      |&#xA;| 11 | Zoe Garcia | Arijit Singh         | 2025-04-26 05:31:02 | song         |&#xA;| 12 | Zoe Garcia | Arijit Singh         | 2025-10-13 17:34:03 | song         |&#xA;| 13 | Zoe Garcia | Mariah Carey         | 2025-01-20 11:21:37 | song         |&#xA;| 14 | Zoe Garcia | Arijit Singh         | 2025-11-28 03:55:31 | song         |&#xA;| 15 | Zoe Garcia | Arijit Singh         | 2025-07-17 05:18:16 | song         |&#xA;| 16 | Zoe Garcia | Arijit Singh         | 2025-08-20 02:07:45 | song         |&#xA;| 17 | Zoe Garcia | Kendrick Lamar       | 2025-02-16 13:25:27 | song         |&#xA;| 18 | Zoe Garcia | Huberman Lab         | 2025-08-13 19:55:00 | podcast      |&#xA;| 19 | Zoe Garcia | Bruno Mars           | 2025-09-13 07:09:43 | song         |&#xA;| 20 | Zoe Garcia | Arijit Singh         | 2025-04-12 06:30:44 | song         |&#xA;+----+------------+----------------------+---------------------+--------------+&#xA;sqlite&gt;&#xA;sqlite&gt; SELECT COUNT(*) FROM listening_logs;&#xA;+----------+&#xA;| COUNT(*) |&#xA;+----------+&#xA;| 18174    |&#xA;+----------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;OK! We have around 18k records in a single table! That&#39;s a lot, but not too much!&#xA;&#xA;Let&#39;s see what we have to do.&#xA;&#xA;## Problem&#xA;&#xA;&gt; Write a query that returns the top 3 artists per user. Order the results by the most played&#xA;&#xA;Alright, this is quite a problem to solve. If you are thinking it&#39;s easy peasy, then hold on!&#xA;&#xA;We can clearly see that we have 2 columns of interest.&#xA;1. `user_name`&#xA;2. 
`artist`&#xA;&#xA;We need to group each user&#39;s plays by artist, and rank the top 3 artists per user.&#xA;&#xA;And each entry is a song or podcast that the user has listened to.&#xA;&#xA;We need to aggregate, group, and then rank, and then what?&#xA;&#xA;How would you chunk out the top three?&#xA;It&#39;s time to put your SQL glasses and gloves on, it&#39;s getting colder!&#xA;&#xA;### Counting Artists Per User&#xA;&#xA;Let&#39;s take one step at a time: first we need to count how many times each artist has been played for each user.&#xA;&#xA;```sql&#xA;SELECT&#xA;    user_name,&#xA;    artist,&#xA;    COUNT(*) AS play_count&#xA;FROM listening_logs&#xA;GROUP BY user_name, artist&#xA;ORDER BY user_name, play_count DESC, artist;&#xA;```&#xA;OK, simple, right?&#xA;&#xA;Select the usernames and artists, group by username and artist, and count the number of times the user played that artist. Then order by username, play count (descending) and artist (if there is a tie in count, the artist name acts as the tie-breaker).&#xA;&#xA;But this gives all the artists, not just the top 3: it orders them in decreasing order of plays, but we only want to list the top 3 per user.&#xA;&#xA;That&#39;s tricky!&#xA;&#xA;### With SELF JOIN&#xA;&#xA;What if we join the table with itself?&#xA;&#xA;Then, for each artist, we can compare the number of times the user has played that artist with the number of times the user has played every other artist. 
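
Before wiring this up in SQL, the target behaviour can be sketched in a few lines of plain Python (a toy log with made-up users and rows, standard library only, not the puzzle data):

```python
from collections import Counter

# Hypothetical listening log: (user, artist) pairs -- toy data for illustration.
plays = [
    ("zoe", "Arijit Singh"), ("zoe", "Arijit Singh"), ("zoe", "Huberman Lab"),
    ("zoe", "Huberman Lab"), ("zoe", "Huberman Lab"), ("zoe", "Mariah Carey"),
    ("zoe", "Bruno Mars"), ("zoe", "Bruno Mars"), ("zoe", "Candace"),
    ("amir", "Coldplay"), ("amir", "Coldplay"), ("amir", "Snow Patrol"),
]

# Count plays per (user, artist), then collect the counts per user.
counts = Counter(plays)
per_user = {}
for (user, artist), n in counts.items():
    per_user.setdefault(user, []).append((artist, n))

# Sort each user's artists by play count (descending), breaking ties by
# artist name (ascending), and slice off the top 3.
top3 = {
    user: sorted(rows, key=lambda r: (-r[1], r[0]))[:3]
    for user, rows in per_user.items()
}
print(top3["zoe"])  # [('Huberman Lab', 3), ('Arijit Singh', 2), ('Bruno Mars', 2)]
```

The self-join approach reproduces exactly this sort-and-slice, only in SQL.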
From that comparison we can keep an artist only if fewer than 3 other artists beat it for that user, which filters everything down to the top 3 for us.&#xA;&#xA;```sql&#xA;SELECT &#xA;    a.user_name,&#xA;    a.artist AS current_artist,&#xA;    a.play_count AS current_plays,&#xA;    b.artist AS other_artist,&#xA;    b.play_count AS other_plays&#xA;FROM (&#xA;    SELECT user_name, artist, COUNT(*) AS play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) a&#xA;LEFT JOIN (&#xA;    SELECT user_name, artist, COUNT(*) AS play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) b ON a.user_name = b.user_name&#xA;ORDER BY a.user_name, a.play_count DESC, b.play_count DESC;&#xA;```&#xA;&#xA;This would create a cross join of sorts between the same table.&#xA;&#xA;- Select the required columns (user_name, artist and count)&#xA;Look at this part. This is `a`:&#xA;```sql&#xA;-- a&#xA;SELECT user_name, artist, COUNT(*) AS play_count&#xA;FROM listening_logs&#xA;GROUP BY user_name, artist&#xA;ORDER BY user_name, play_count DESC, artist&#xA;```&#xA;&#xA;Then we need to join this with itself.&#xA;&#xA;- Select the required columns from the same table (user_name, artist and count)&#xA;This is `b`:&#xA;```sql&#xA;-- b&#xA;SELECT user_name, artist, COUNT(*) AS play_count&#xA;FROM listening_logs&#xA;GROUP BY user_name, artist&#xA;ORDER BY user_name, play_count DESC, artist&#xA;```&#xA;Both `a` and `b` are the same, just that we want a cross join of sorts.&#xA;&#xA;And then we need to join `a` and `b`:&#xA;- Join with itself on the user_name&#xA;- Order by user_name, play_count desc, artist&#xA;&#xA;```sql&#xA;SELECT &#xA;    a.user_name,&#xA;    a.artist AS current_artist,&#xA;    a.play_count AS current_plays,&#xA;    b.artist AS other_artist,&#xA;    b.play_count AS other_plays&#xA;FROM (&#xA;    SELECT user_name, artist, COUNT(*) AS play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) a&#xA;LEFT JOIN (&#xA;    SELECT user_name, artist, COUNT(*) AS 
play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) b ON a.user_name = b.user_name&#xA;ORDER BY a.user_name, a.play_count DESC, b.play_count DESC;&#xA;```&#xA;&#xA;Now, for each artist in `a`, we want to match the rows in `b` that have a greater play count, or an equal play count with an alphabetically earlier artist name.&#xA;&#xA;To do that we can continue the `JOIN` condition with `AND` and add `b.play_count &gt; a.play_count`, with `b.artist &lt; a.artist` in case of a tie.&#xA;The idea here is subtle:&#xA;- For a given artist `a`, we count how many artists `b` (for the same user) have more plays, or the same plays but come earlier alphabetically.&#xA;- If fewer than 3 artists beat `a`, then `a` must be in the top 3.&#xA;&#xA;So the query becomes this:&#xA;&#xA;```sql&#xA;SELECT &#xA;    a.user_name,&#xA;    a.artist,&#xA;    a.play_count&#xA;FROM (&#xA;    SELECT user_name, artist, COUNT(*) AS play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) a&#xA;LEFT JOIN (&#xA;    SELECT user_name, artist, COUNT(*) AS play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) b ON a.user_name = b.user_name&#xA;   AND (&#xA;       b.play_count &gt; a.play_count&#xA;       OR (b.play_count = a.play_count AND b.artist &lt; a.artist)&#xA;   )&#xA;GROUP BY a.user_name, a.artist, a.play_count&#xA;HAVING COUNT(b.artist) &lt; 3&#xA;ORDER BY a.user_name, a.play_count DESC, a.artist;&#xA;```&#xA;```&#xA;sqlite&gt; SELECT&#xA;    a.user_name,&#xA;    a.artist,&#xA;    a.play_count&#xA;FROM (&#xA;    SELECT user_name, artist, COUNT(*) AS play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) a&#xA;LEFT JOIN (&#xA;    SELECT user_name, artist, COUNT(*) AS play_count&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) b ON a.user_name = b.user_name&#xA;   AND (&#xA;       b.play_count &gt; a.play_count&#xA;       OR (b.play_count = a.play_count AND b.artist &lt; 
a.artist)&#xA;   )&#xA;GROUP BY a.user_name, a.artist, a.play_count&#xA;HAVING COUNT(b.artist) &lt; 3&#xA;ORDER BY a.user_name, a.play_count DESC, a.artist;&#xA;+-------------------+--------------------------------------------+------------+&#xA;|     user_name     |                   artist                   | play_count |&#xA;+-------------------+--------------------------------------------+------------+&#xA;| Abigail Hernandez | Ed Sheeran                                 | 78         |&#xA;| Abigail Hernandez | Rotten Mango                               | 15         |&#xA;| Abigail Hernandez | Billie Eilish                              | 4          |&#xA;| Adrian Cox        | Kendrick Lamar                             | 128        |&#xA;| Adrian Cox        | Stuff You Should Know                      | 30         |&#xA;| Adrian Cox        | Fuerza Regida                              | 6          |&#xA;| Alex Rivera       | Ed Sheeran                                 | 274        |&#xA;| Alex Rivera       | Call Her Daddy (Alex Cooper)               | 42         |&#xA;| Alex Rivera       | Green Day                                  | 11         |&#xA;| Anders Nilsson    | Snow Patrol                                | 101        |&#xA;| Anders Nilsson    | SmartLess                                  | 29         |&#xA;| Anders Nilsson    | Blink-182                                  | 5          |&#xA;| Anthony King      | Pentatonix                                 | 114        |&#xA;| Anthony King      | The Tucker Carlson Show                    | 14         |&#xA;| Anthony King      | Angels &amp; Airwaves                          | 5          |&#xA;...&#xA;...&#xA;| Zara Sheikh       | Green Day                                  | 138        |&#xA;| Zara Sheikh       | This Past Weekend w Theo Von               | 20         |&#xA;| Zara Sheikh       | The Beatles                                | 7          |&#xA;| Zoe Garcia        | Arijit Singh                     
          | 50         |&#xA;| Zoe Garcia        | Huberman Lab                               | 14         |&#xA;| Zoe Garcia        | Kendrick Lamar                             | 5          |&#xA;| Zoe Wilson        | Pentatonix                                 | 65         |&#xA;| Zoe Wilson        | The Mel Robbins Podcast                    | 14         |&#xA;| Zoe Wilson        | Angels &amp; Airwaves                          | 4          |&#xA;| Zuri Okafor       | Kendrick Lamar                             | 96         |&#xA;| Zuri Okafor       | The Tim Dillon Show                        | 14         |&#xA;| Zuri Okafor       | Ed Sheeran                                 | 5          |&#xA;+-------------------+--------------------------------------------+------------+&#xA;Run Time: real 0.088 user 0.082510 sys 0.003999&#xA;```&#xA;This is the final query. It looks long, and it might not be the best way to do it, but it&#39;s definitely not the worst.&#xA;&#xA;### With Window Functions&#xA;&#xA;Ok! I don&#39;t know window functions, but I searched and found that we could partition things before we group them or order them in the final result set. That&#39;s what we want, right?&#xA;&#xA;We had grouped the logs for each username and artist and counted the number of plays. Now we want to rank the artists for each user in decreasing order of the number of plays.&#xA;&#xA;So, we start with the same thing:&#xA;&#xA;```sql&#xA;SELECT&#xA;    user_name,&#xA;    artist,&#xA;    COUNT(*) AS play_count&#xA;FROM listening_logs&#xA;GROUP BY user_name, artist&#xA;ORDER BY user_name, play_count DESC, artist;&#xA;```&#xA;&#xA;You can see that we have a `user_name` column. What if we could separate out the users, and then rank the artists for each user separately?&#xA;&#xA;For that we can use the `ROW_NUMBER` window function. 
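
If you want to poke at `ROW_NUMBER` outside the SQLite shell, here is a tiny self-contained session through Python's built-in `sqlite3` module (window functions need SQLite 3.25 or newer; the table and rows are made up for the demo):

```python
import sqlite3

# A tiny in-memory table -- hypothetical rows, just to watch ROW_NUMBER() work.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE plays (user_name TEXT, artist TEXT);
    INSERT INTO plays VALUES
        ('zoe', 'Arijit Singh'), ('zoe', 'Arijit Singh'), ('zoe', 'Huberman Lab'),
        ('amir', 'Coldplay'), ('amir', 'Coldplay'), ('amir', 'Snow Patrol');
""")

# Rank artists inside each user's partition by play count, then artist name.
rows = con.execute("""
    SELECT user_name, artist, COUNT(*) AS play_count,
           ROW_NUMBER() OVER (
               PARTITION BY user_name
               ORDER BY COUNT(*) DESC, artist
           ) AS ranks
    FROM plays
    GROUP BY user_name, artist
""").fetchall()

for row in rows:
    print(row)
```

Each user restarts at rank 1, which is exactly what `PARTITION BY user_name` buys us.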
This function needs a `PARTITION BY` clause, which lets us create separate partitions based on certain columns, and then we can order the rows within each partition using an `ORDER BY` clause, just as in an ordinary statement. The resulting row number is what we can use as a rank for each artist, based on the number of times the user has played them.&#xA;&#xA;```sql&#xA;SELECT &#xA;    user_name,&#xA;    artist,&#xA;    COUNT(*) AS play_count,&#xA;    ROW_NUMBER() OVER (&#xA;        PARTITION BY user_name &#xA;        ORDER BY COUNT(*) DESC, artist&#xA;    ) AS ranks&#xA;FROM listening_logs&#xA;GROUP BY user_name, artist&#xA;```&#xA;&#xA;Here, `ROW_NUMBER` is a window function that assigns a rank to each row in the result set. It partitions the result set by `user_name` and orders the rows within each partition by `COUNT(*)` in decreasing order, then by `artist` name.&#xA;&#xA;So, imagine this table:&#xA;&#xA;```&#xA;+-------------------+-------------------------------------+------------+-------+&#xA;|     user_name     |               artist                | play_count | ranks |&#xA;+-------------------+-------------------------------------+------------+-------+&#xA;| Abigail Hernandez | Ed Sheeran                          | 78         | 1     |&#xA;| Abigail Hernandez | Rotten Mango                        | 15         | 2     |&#xA;| Abigail Hernandez | Billie Eilish                       | 4          | 3     |&#xA;| Abigail Hernandez | Hans Zimmer                         | 4          | 4     |&#xA;| Abigail Hernandez | John Legend                         | 3          | 5     |&#xA;| Abigail Hernandez | John Williams                       | 3          | 6     |&#xA;| Abigail Hernandez | The Beatles                         | 3          | 7     |&#xA;| Abigail Hernandez | The Rolling Stones                  | 3          | 8     |&#xA;| Abigail Hernandez | Angels &amp; Airwaves                   | 2          | 9     |&#xA;| Abigail Hernandez | Bad Bunny                           | 2          | 10   
 |&#xA;| Abigail Hernandez | Beyonce                             | 2          | 11    |&#xA;| Abigail Hernandez | Coldplay                            | 2          | 12    |&#xA;| Abigail Hernandez | Foo Fighters                        | 2          | 13    |&#xA;| Abigail Hernandez | Fuerza Regida                       | 2          | 14    |&#xA;| Abigail Hernandez | Kendrick Lamar                      | 2          | 15    |&#xA;| Abigail Hernandez | Ludovico Einaudi                    | 2          | 16    |&#xA;| Abigail Hernandez | Mariah Carey                        | 2          | 17    |&#xA;| Abigail Hernandez | Pentatonix                          | 2          | 18    |&#xA;| Abigail Hernandez | SmartLess                           | 2          | 19    |&#xA;| Abigail Hernandez | The Weeknd                          | 2          | 20    |&#xA;| Abigail Hernandez | Adele                               | 1          | 21    |&#xA;| Abigail Hernandez | Ariana Grande                       | 1          | 22    |&#xA;| Abigail Hernandez | Armchair Expert With Dax Shepard    | 1          | 23    |&#xA;| Abigail Hernandez | Bruno Mars                          | 1          | 24    |&#xA;| Abigail Hernandez | Candace                             | 1          | 25    |&#xA;| Abigail Hernandez | Crime, Conspiracy, Cults and Murder | 1          | 26    |&#xA;| Abigail Hernandez | Green Day                           | 1          | 27    |&#xA;| Abigail Hernandez | Matt and Shanes Secret Podcast      | 1          | 28    |&#xA;| Abigail Hernandez | On Purpose With Jay Shetty          | 1          | 29    |&#xA;| Abigail Hernandez | Snow Patrol                         | 1          | 30    |&#xA;| Abigail Hernandez | Sufjan Stevens                      | 1          | 31    |&#xA;| Abigail Hernandez | Taylor Swift                        | 1          | 32    |&#xA;| Abigail Hernandez | The Mel Robbins Podcast             | 1          | 33    |&#xA;| Abigail Hernandez | Unseen          
                    | 1          | 34    |&#xA;| Adrian Cox        | Kendrick Lamar                      | 128        | 1     |&#xA;| Adrian Cox        | Stuff You Should Know               | 30         | 2     |&#xA;| Adrian Cox        | Fuerza Regida                       | 6          | 3     |&#xA;| Adrian Cox        | Pentatonix                          | 6          | 4     |&#xA;| Adrian Cox        | Taylor Swift                        | 6          | 5     |&#xA;| Adrian Cox        | Snow Patrol                         | 5          | 6     |&#xA;+-------------------+-------------------------------------+------------+-------+&#xA;```&#xA;&#xA;Now, we have ranked the artists for each user; there are `34` artists played by `Abigail Hernandez`, so there are `34` ranks. Next, we need to filter out the top 3 artists for each user. That would be simple, right?&#xA;&#xA;Just add the `WHERE` clause:&#xA;&#xA;```sql&#xA;SELECT&#xA;    user_name,&#xA;    artist,&#xA;    COUNT(*) AS play_count,&#xA;    ROW_NUMBER() OVER (&#xA;        PARTITION BY user_name &#xA;        ORDER BY COUNT(*) DESC, artist&#xA;    ) AS ranks&#xA;FROM listening_logs&#xA;WHERE ranks &lt;= 3&#xA;GROUP BY user_name, artist&#xA;ORDER BY user_name, play_count DESC, artist;&#xA;```&#xA;&#xA;Well, not really!&#xA;&#xA;```&#xA;sqlite&gt; SELECT&#xA;    user_name,&#xA;    artist,&#xA;    COUNT(*) AS play_count,&#xA;    ROW_NUMBER() OVER (&#xA;        PARTITION BY user_name &#xA;        ORDER BY COUNT(*) DESC, artist&#xA;    )&#xA;FROM listening_logs&#xA;WHERE ranks &lt;= 3&#xA;GROUP BY user_name, artist&#xA;ORDER BY user_name, play_count DESC, artist;&#xA;Run Time: real 0.000 user 0.000110 sys 0.000000&#xA;Parse error: misuse of aliased window function ranks&#xA;```&#xA;&#xA;We can&#39;t filter the window function column with a `WHERE` clause. At the time the `WHERE` clause is evaluated, the `SELECT` list (including `ranks`) has not been computed yet. 
So `ranks` doesn&#39;t exist yet!&#xA;&#xA;So, now what? So close, yet so far!&#xA;&#xA;Let&#39;s wrap the `SELECT` in a subquery:&#xA;&#xA;```sql&#xA;SELECT user_name, artist, play_count&#xA;FROM (&#xA;    SELECT &#xA;        user_name,&#xA;        artist,&#xA;        COUNT(*) AS play_count,&#xA;        ROW_NUMBER() OVER (&#xA;            PARTITION BY user_name&#xA;            ORDER BY COUNT(*) DESC, artist&#xA;        ) AS ranks&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;) ranked&#xA;WHERE ranks &lt;= 3&#xA;ORDER BY user_name, play_count DESC, artist;&#xA;```&#xA;&#xA;And there we have it!&#xA;&#xA;We wrapped the `SELECT` in a subquery, and now we can filter the window function column with a `WHERE` clause in the outer query.&#xA;&#xA;We can even do it with a `CTE`, i.e. a common table expression. That&#39;s just a subquery with a name, which comes in handy once queries grow longer or a subquery is reused.&#xA;&#xA;```sql&#xA;WITH ranked AS (&#xA;    SELECT&#xA;        user_name,&#xA;        artist,&#xA;        COUNT(*) AS play_count,&#xA;        ROW_NUMBER() OVER (&#xA;            PARTITION BY user_name&#xA;            ORDER BY COUNT(*) DESC, artist&#xA;        ) AS ranks&#xA;    FROM listening_logs&#xA;    GROUP BY user_name, artist&#xA;)&#xA;SELECT user_name, artist, play_count&#xA;FROM ranked&#xA;WHERE ranks &lt;= 3&#xA;ORDER BY user_name, play_count DESC, artist;&#xA;```&#xA;&#xA;So, that is day 5; it is getting a little tricky now!&#xA;&#xA;Off to day 6.</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 4: WinterFest Volunteers</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-4</link>
      <description>Advent of SQL Day 4 WinterFest Volunteers It is day 4 of advent of SQL. No fuss, straight to the problem, the elves and humans are getting dumber as the days pr</description>
      <pubDate>Fri, 19 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL Day 4 WinterFest Volunteers&#xA;&#xA;It is day 4 of advent of SQL.&#xA;&#xA;No fuss, straight to the problem, the elves and humans are getting dumber as the days progress.&#xA;&#xA;Let&#39;s download the SQL inserts for the day.&#xA;&#xA;And load it into a SQLite shell.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS official_shifts;&#xA;DROP TABLE IF EXISTS last_minute_signups;&#xA;&#xA;CREATE TABLE official_shifts (&#xA;    id INT PRIMARY KEY,&#xA;    volunteer_name TEXT,&#xA;    role TEXT,&#xA;    shift_time TEXT,&#xA;    age_group TEXT,&#xA;    code TEXT&#xA;);&#xA;&#xA;CREATE TABLE last_minute_signups (&#xA;    id INT PRIMARY KEY,&#xA;    volunteer_name TEXT,&#xA;    assigned_task TEXT,&#xA;    time_slot TEXT&#xA;);&#xA;&#xA;INSERT INTO official_shifts (id, volunteer_name, role, shift_time, age_group, code) VALUES&#xA;    (1, &#39;Jude Thompson&#39;, &#39;choir_assistant&#39;, &#39;12:00 PM&#39;, &#39;senior&#39;, NULL),&#xA;    (2, &#39;Mateo Cruz&#39;, &#39;choir_assistant&#39;, &#39;12:00 PM&#39;, &#39;senior&#39;, NULL),&#xA;    (3, &#39;Olivia Dubois&#39;, &#39;choir_assistant&#39;, &#39;2:00 PM&#39;, &#39;teen&#39;, &#39;A1&#39;),&#xA;    (4, &#39;Jeff Bezos&#39;, &#39;choir_assistant&#39;, &#39;10:00 AM&#39;, &#39;adult&#39;, &#39;X7&#39;),&#xA;    (5, &#39;Kian Rahimi&#39;, &#39;stage_setup&#39;, &#39;12:00 PM&#39;, &#39;adult&#39;, &#39;X7&#39;),&#xA;    (6, &#39;Haruto Sato&#39;, &#39;cocoa_station&#39;, &#39;10:00 AM&#39;, &#39;adult&#39;, &#39;X7&#39;),&#xA;    (7, &#39;Uma Singh&#39;, &#39;parking_support&#39;, &#39;10:00 AM&#39;, &#39;adult&#39;, NULL),&#xA;    (8, &#39;Owen Scott&#39;, &#39;parking_support&#39;, &#39;10:00 AM&#39;, &#39;adult&#39;, &#39;X7&#39;),&#xA;    (9, &#39;Adil Rahman&#39;, &#39;stage_setup&#39;, &#39;2:00 PM&#39;, &#39;adult&#39;, &#39;A1&#39;),&#xA;    (10, &#39;Aaron Diaz&#39;, &#39;choir_assistant&#39;, &#39;2:00 PM&#39;, &#39;senior&#39;, &#39;X7&#39;),&#xA;    (11, &#39;Carter Lewis&#39;, 
&#39;cocoa_station&#39;, &#39;10:00 AM&#39;, &#39;senior&#39;, &#39;B2&#39;),&#xA;    (12, &#39;Anya Pavlov&#39;, &#39;stage_setup&#39;, &#39;10:00 AM&#39;, &#39;senior&#39;, &#39;OLD&#39;),&#xA;    (13, &#39;Ethan Brown&#39;, &#39;stage_setup&#39;, &#39;2:00 PM&#39;, &#39;adult&#39;, &#39;A1&#39;),&#xA;    (14, &#39;Lucia Fernandez&#39;, &#39;choir_assistant&#39;, &#39;12:00 PM&#39;, &#39;senior&#39;, &#39;X7&#39;),&#xA;    (15, &#39;Casey Morgan&#39;, &#39;choir_assistant&#39;, &#39;12:00 PM&#39;, &#39;teen&#39;, &#39;OLD&#39;);&#xA;&#xA;INSERT INTO last_minute_signups (id, volunteer_name, assigned_task, time_slot) VALUES&#xA;    (1, &#39;Jude Thompson&#39;, &#39;Choir&#39;, &#39;noon&#39;),&#xA;    (2, &#39;Mateo Cruz&#39;, &#39;choir&#39;, &#39;noon&#39;),&#xA;    (3, &#39;Olivia Dubois&#39;, &#39;choir&#39;, &#39;2 PM&#39;),&#xA;    (4, &#39;Jeff Bezos&#39;, &#39;choir assistant&#39;, &#39;10AM&#39;),&#xA;    (5, &#39;Kian Rahimi&#39;, &#39;stage setup&#39;, &#39;noon&#39;),&#xA;    (6, &#39;Haruto Sato&#39;, &#39;cocoa station&#39;, &#39;10AM&#39;),&#xA;    (7, &#39;Uma Singh&#39;, &#39;parking_support&#39;, &#39;10AM&#39;),&#xA;    (8, &#39;Owen Scott&#39;, &#39;parking&#39;, &#39;10AM&#39;),&#xA;    (9, &#39;Adil Rahman&#39;, &#39;Stage-Setup&#39;, &#39;2 PM&#39;),&#xA;    (10, &#39;Aaron Diaz&#39;, &#39;Choir&#39;, &#39;2 PM&#39;),&#xA;    (11, &#39;Carter Lewis&#39;, &#39;Cocoa Station&#39;, &#39;10AM&#39;),&#xA;    (12, &#39;Anya Pavlov&#39;, &#39;stage_setup&#39;, &#39;10AM&#39;),&#xA;    (13, &#39;Olivia Brown&#39;, &#39;stage setup&#39;, &#39;2 PM&#39;),&#xA;    (14, &#39;Lena Fischer&#39;, &#39;cocoa station&#39;, &#39;2 pm&#39;),&#xA;    (15, &#39;Nolan Murphy&#39;, &#39;parking-support&#39;, &#39;10AM&#39;);&#xA;```&#xA;&#xA;Once the data is loaded, let&#39;s sneak a peek.&#xA;&#xA;```sql&#xA;SELECT * FROM official_shifts LIMIT 15;&#xA;```&#xA;&#xA;```sql&#xA;SELECT * FROM last_minute_signups LIMIT 15;&#xA;```&#xA;&#xA;Let&#39;s count how many rows 
in each table we have (a `COUNT(*)` returns a single row, so no `LIMIT` is needed):&#xA;&#xA;```sql&#xA;SELECT COUNT(*) FROM official_shifts;&#xA;SELECT COUNT(*) FROM last_minute_signups;&#xA;```&#xA;&#xA;Alright, the data is visible and we can head on to the problem statement.&#xA;&#xA;```&#xA;$ sqlite3&#xA;SQLite version 3.45.1 2024-01-30 16:01:20&#xA;Enter &#34;.help&#34; for usage hints.&#xA;Connected to a transient in-memory database.&#xA;Use &#34;.open FILENAME&#34; to reopen on a persistent database.&#xA;sqlite&gt; .read day4_inserts.sql&#xA;sqlite&gt; .schema&#xA;CREATE TABLE official_shifts (&#xA;    id INT PRIMARY KEY,&#xA;    volunteer_name TEXT,&#xA;    role TEXT,&#xA;    shift_time TEXT,&#xA;    age_group TEXT,&#xA;    code TEXT&#xA;);&#xA;CREATE TABLE last_minute_signups (&#xA;    id INT PRIMARY KEY,&#xA;    volunteer_name TEXT,&#xA;    assigned_task TEXT,&#xA;    time_slot TEXT&#xA;);&#xA;&#xA;sqlite&gt; SELECT * FROM official_shifts LIMIT 15;&#xA;1|Jude Thompson|choir_assistant|12:00 PM|senior|&#xA;2|Mateo Cruz|choir_assistant|12:00 PM|senior|&#xA;3|Olivia Dubois|choir_assistant|2:00 PM|teen|A1&#xA;4|Jeff Bezos|choir_assistant|10:00 AM|adult|X7&#xA;5|Kian Rahimi|stage_setup|12:00 PM|adult|X7&#xA;6|Haruto Sato|cocoa_station|10:00 AM|adult|X7&#xA;7|Uma Singh|parking_support|10:00 AM|adult|&#xA;8|Owen Scott|parking_support|10:00 AM|adult|X7&#xA;9|Adil Rahman|stage_setup|2:00 PM|adult|A1&#xA;10|Aaron Diaz|choir_assistant|2:00 PM|senior|X7&#xA;11|Carter Lewis|cocoa_station|10:00 AM|senior|B2&#xA;12|Anya Pavlov|stage_setup|10:00 AM|senior|OLD&#xA;13|Ethan Brown|stage_setup|2:00 PM|adult|A1&#xA;14|Lucia Fernandez|choir_assistant|12:00 PM|senior|X7&#xA;15|Casey Morgan|choir_assistant|12:00 PM|teen|OLD&#xA;&#xA;sqlite&gt; .mode table &#xA;sqlite&gt; SELECT * FROM official_shifts LIMIT 15;&#xA;+----+-----------------+-----------------+------------+-----------+------+&#xA;| id | volunteer_name  |      role       | shift_time | age_group | code 
|&#xA;+----+-----------------+-----------------+------------+-----------+------+&#xA;| 1  | Jude Thompson   | choir_assistant | 12:00 PM   | senior    |      |&#xA;| 2  | Mateo Cruz      | choir_assistant | 12:00 PM   | senior    |      |&#xA;| 3  | Olivia Dubois   | choir_assistant | 2:00 PM    | teen      | A1   |&#xA;| 4  | Jeff Bezos      | choir_assistant | 10:00 AM   | adult     | X7   |&#xA;| 5  | Kian Rahimi     | stage_setup     | 12:00 PM   | adult     | X7   |&#xA;| 6  | Haruto Sato     | cocoa_station   | 10:00 AM   | adult     | X7   |&#xA;| 7  | Uma Singh       | parking_support | 10:00 AM   | adult     |      |&#xA;| 8  | Owen Scott      | parking_support | 10:00 AM   | adult     | X7   |&#xA;| 9  | Adil Rahman     | stage_setup     | 2:00 PM    | adult     | A1   |&#xA;| 10 | Aaron Diaz      | choir_assistant | 2:00 PM    | senior    | X7   |&#xA;| 11 | Carter Lewis    | cocoa_station   | 10:00 AM   | senior    | B2   |&#xA;| 12 | Anya Pavlov     | stage_setup     | 10:00 AM   | senior    | OLD  |&#xA;| 13 | Ethan Brown     | stage_setup     | 2:00 PM    | adult     | A1   |&#xA;| 14 | Lucia Fernandez | choir_assistant | 12:00 PM   | senior    | X7   |&#xA;| 15 | Casey Morgan    | choir_assistant | 12:00 PM   | teen      | OLD  |&#xA;+----+-----------------+-----------------+------------+-----------+------+&#xA;&#xA;sqlite&gt; SELECT * FROM last_minute_signups LIMIT 15;&#xA;+----+----------------+-----------------+-----------+&#xA;| id | volunteer_name |  assigned_task  | time_slot |&#xA;+----+----------------+-----------------+-----------+&#xA;| 1  | Jude Thompson  | Choir           | noon      |&#xA;| 2  | Mateo Cruz     | choir           | noon      |&#xA;| 3  | Olivia Dubois  | choir           | 2 PM      |&#xA;| 4  | Jeff Bezos     | choir assistant | 10AM      |&#xA;| 5  | Kian Rahimi    | stage setup     | noon      |&#xA;| 6  | Haruto Sato    | cocoa station   | 10AM      |&#xA;| 7  | Uma Singh      | parking_support | 10AM      |&#xA;| 8  | 
Owen Scott     | parking         | 10AM      |&#xA;| 9  | Adil Rahman    | Stage-Setup     | 2 PM      |&#xA;| 10 | Aaron Diaz     | Choir           | 2 PM      |&#xA;| 11 | Carter Lewis   | Cocoa Station   | 10AM      |&#xA;| 12 | Anya Pavlov    | stage_setup     | 10AM      |&#xA;| 13 | Olivia Brown   | stage setup     | 2 PM      |&#xA;| 14 | Lena Fischer   | cocoa station   | 2 pm      |&#xA;| 15 | Nolan Murphy   | parking-support | 10AM      |&#xA;+----+----------------+-----------------+-----------+&#xA;&#xA;sqlite&gt; SELECT count(*) FROM official_shifts;&#xA;+----------+&#xA;| count(*) |&#xA;+----------+&#xA;| 250      |&#xA;+----------+&#xA;&#xA;sqlite&gt; SELECT count(*) FROM last_minute_signups;&#xA;+----------+&#xA;| count(*) |&#xA;+----------+&#xA;| 126      |&#xA;+----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;## Problem&#xA;&#xA;Here&#39;s the challenge for day 4:&#xA;&#xA;&gt; Using the official_shifts and last_minute_signups tables, create a combined de-duplicated volunteer list.&#xA;&gt; &#xA;&gt; Ensure the list has standardized role labels of Stage Setup, Cocoa Station, Parking Support, Choir Assistant, Snow Shoveling, Handwarmer Handout.&#xA;&gt; &#xA;&gt; Make sure that the timeslot formats follow John&#39;s official shifts format.&#xA;&#xA;What we have here is an `official_shifts` table, which could have been entered by the system. However, the `last_minute_signups` table is messy and has been added from a sheet, so we need to clean it up and combine the two tables&#39; data into a single de-duplicated list of volunteers.&#xA;&#xA;Let&#39;s see what we&#39;ve got:&#xA;&#xA;```sql&#xA;SELECT * FROM last_minute_signups;&#xA;```&#xA;&#xA;Ok, we have 126 records and the columns are:&#xA;1. `volunteer_name`, which doesn&#39;t look bad,&#xA;2. `assigned_task`, which looks wonky,&#xA;3. 
`time_slot`, which is just wild; we have weird definitions of times in there.&#xA;&#xA;And let&#39;s look at the `official_shifts`:&#xA;&#xA;```sql&#xA;SELECT * FROM official_shifts;&#xA;```&#xA;&#xA;This looks neat and tidy; nothing looks off.&#xA;&#xA;So we need to make sure we clean up `last_minute_signups` before we merge them.&#xA;&#xA;```sql&#xA;SELECT DISTINCT assigned_task FROM last_minute_signups;&#xA;```&#xA;&#xA;Ok, so casing is one thing we can see, `-` and ` ` (space) separators need normalizing, and then there is some inconsistent naming like `choir` vs `choir assistant` and `parking_support` vs `parking`. We need to clean &#39;em up.&#xA;&#xA;We can normalize the task names with a `CASE` of `LIKE` patterns:&#xA;&#xA;```sql&#xA;SELECT &#xA;    id,&#xA;    volunteer_name,&#xA;    time_slot,&#xA;    CASE &#xA;        WHEN assigned_task LIKE &#39;choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE &#39;%shovel%&#39; THEN &#39;snow_showel&#39;&#xA;    END AS assigned_task&#xA;FROM last_minute_signups;&#xA;```&#xA;&#xA;We just do a case match for:&#xA;- `LIKE &#39;choir%&#39;` which will match any case (`Choir`, `choir`) and also anything after `choir...` like `choir assistant`.&#xA;- `LIKE &#39;stage%&#39;` which will match any case (`Stage`, `stage`) and also anything after `stage...` like `Stage-Setup`, `stage    setup` or `stage_setup`.&#xA;- `LIKE &#39;%cocoa%&#39;` which will match any case (`Cocoa`, `cocoa`) and also anything before or after `...cocoa...` like `Cocoa Station`, `cocoa station`, etc.&#xA;- `LIKE &#39;parking%&#39;` which will match any case (`Parking`, `parking`) and also anything after `parking...` like `parking-support` 
or `parking_support`, etc.&#xA;- `LIKE &#39;hand%&#39;` which will match any case (`Hand`, `hand`) and also anything after `hand...` like `handwarmer handout`, `handwarmers`, `Handwarmer-Handout`, etc.&#xA;- `LIKE &#39;%shovel%&#39;` which will match any case (`Shovel`, `shovel`) and also anything before and after `...shovel...`  like `Snow-Shoveling`, `shovel`, `snow shoveling`, etc.&#xA;&#xA;Ok, now this looks unified for the `assigned_task`.&#xA;&#xA;```&#xA;sqlite&gt; SELECT &#xA;    id,&#xA;    volunteer_name,&#xA;    time_slot,&#xA;    CASE &#xA;        WHEN assigned_task LIKE &#39;choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE &#39;%shovel%&#39; THEN &#39;snow_showel&#39; &#xA;    END AS assigned_task &#xA;FROM last_minute_signups;&#xA;&#xA;+-----+-------------------+-----------+-----------------+&#xA;| id  |  volunteer_name   | time_slot |  assigned_task  |&#xA;+-----+-------------------+-----------+-----------------+&#xA;| 1   | Jude Thompson     | noon      | choir_assistant |&#xA;| 2   | Mateo Cruz        | noon      | choir_assistant |&#xA;| 3   | Olivia Dubois     | 2 PM      | choir_assistant |&#xA;| 4   | Jeff Bezos        | 10AM      | choir_assistant |&#xA;| 5   | Kian Rahimi       | noon      | stage_setup     |&#xA;| 6   | Haruto Sato       | 10AM      | cocoa_station   |&#xA;| 7   | Uma Singh         | 10AM      | parking_support |&#xA;| 8   | Owen Scott        | 10AM      | parking_support |&#xA;| 9   | Adil Rahman       | 2 PM      | stage_setup     |&#xA;| 10  | Aaron Diaz        | 2 PM      | choir_assistant |&#xA;| 11  | Carter Lewis      | 10AM      | cocoa_station   |&#xA;| 12  | Anya 
Pavlov       | 10AM      | stage_setup     |&#xA;| 13  | Olivia Brown      | 2 PM      | stage_setup     |&#xA;| 14  | Lena Fischer      | 2 pm      | cocoa_station   |&#xA;| 15  | Nolan Murphy      | 10AM      | parking_support |&#xA;+-----+-------------------+-----------+-----------------+&#xA;&#xA;```&#xA;&#xA;We need to do the same for the time slot.&#xA;&#xA;```sql&#xA;SELECT DISTINCT time_slot FROM last_minute_signups;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT DISTINCT time_slot FROM last_minute_signups;&#xA;+-----------+&#xA;| time_slot |&#xA;+-----------+&#xA;| noon      |&#xA;| 2 PM      |&#xA;| 10AM      |&#xA;| 2 pm      |&#xA;| 10 am     |&#xA;+-----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Since we have to follow the `official_shifts` format, let&#39;s check over there.&#xA;&#xA;```sql&#xA;SELECT DISTINCT shift_time FROM official_shifts;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT distinct shift_time FROM official_shifts;&#xA;+------------+&#xA;| shift_time |&#xA;+------------+&#xA;| 12:00 PM   |&#xA;| 2:00 PM    |&#xA;| 10:00 AM   |&#xA;+------------+&#xA;```&#xA;&#xA;Ok, we have only 3 target times to map to, so just some small things to standardize.&#xA;&#xA;```sql&#xA;SELECT &#xA;    DISTINCT CASE &#xA;        WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39;&#xA;    END AS time_slot &#xA;FROM last_minute_signups;&#xA;```&#xA;&#xA;```&#xA;sqlite&gt; SELECT &#xA;    DISTINCT CASE &#xA;        WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39;&#xA;        WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39;&#xA;    END AS time_slot &#xA;FROM last_minute_signups;&#xA;&#xA;+-----------+&#xA;| time_slot |&#xA;+-----------+&#xA;| 12:00 PM  |&#xA;| 2:00 PM   |&#xA;| 10:00 AM  |&#xA;+-----------+&#xA;sqlite&gt; 
&#xA;```&#xA;&#xA;So, we have standardized the time_slots.&#xA;&#xA;- `LIKE &#39;2%&#39;` will match any case, and the `%` matches whatever comes after the `2`.&#xA;- `LIKE &#39;10%&#39;` will match any case, and the `%` matches whatever comes after the `10`. We can&#39;t shorten it to `LIKE &#39;1%&#39;`, as that would also match other times starting with `1`, like `12:00`.&#xA;- `LIKE &#39;noon&#39;` will match any case of `noon`, like `NOON` or `Noon`, etc. And we need to map it to `12:00 PM` to match the `HH:MM AM/PM` format.&#xA;&#xA;So, now we can combine them.&#xA;&#xA;```sql&#xA;SELECT id, volunteer_name,&#xA;    CASE&#xA;        WHEN assigned_task LIKE &#39;choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE &#39;%shovel%&#39; THEN &#39;snow_showel&#39;&#xA;    END AS role, &#xA;    CASE&#xA;        WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39;&#xA;    END AS shift_time &#xA;FROM last_minute_signups;&#xA;```&#xA;We just changed the column names from `assigned_task` to `role` and `time_slot` to `shift_time`, as per the naming convention in the `official_shifts` table.&#xA;Phew!
its a long statement.&#xA;&#xA;```&#xA;sqlite&gt; SELECT id, volunteer_name, CASE                                                                                                                                                               WHEN assigned_task LIKE &#39;choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE &#39;%shovel%&#39; THEN &#39;snow_showel&#39; &#xA;    END AS assigned_task, CASE WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39; WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39; WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39; END AS time_slot FROM last_minute_signups;&#xA;&#xA;+-----+-------------------+-----------------+-----------+&#xA;| id  |  volunteer_name   |  assigned_task  | time_slot |&#xA;+-----+-------------------+-----------------+-----------+&#xA;| 1   | Jude Thompson     | choir_assistant | 12:00 PM  |&#xA;| 2   | Mateo Cruz        | choir_assistant | 12:00 PM  |&#xA;| 3   | Olivia Dubois     | choir_assistant | 2:00 PM   |&#xA;| 4   | Jeff Bezos        | choir_assistant | 10:00 AM  |&#xA;| 5   | Kian Rahimi       | stage_setup     | 12:00 PM  |&#xA;| 6   | Haruto Sato       | cocoa_station   | 10:00 AM  |&#xA;| 7   | Uma Singh         | parking_support | 10:00 AM  |&#xA;| 8   | Owen Scott        | parking_support | 10:00 AM  |&#xA;| 9   | Adil Rahman       | stage_setup     | 2:00 PM   |&#xA;| 10  | Aaron Diaz        | choir_assistant | 2:00 PM   |&#xA;| 11  | Carter Lewis      | cocoa_station   | 10:00 AM  |&#xA;| 12  | Anya Pavlov       | stage_setup     | 10:00 AM  |&#xA;| 13  | Olivia Brown      | stage_setup     | 2:00 PM   |&#xA;| 14  | Lena Fischer     
 | cocoa_station   | 2:00 PM   |&#xA;| 15  | Nolan Murphy      | parking_support | 10:00 AM  |&#xA;+-----+-------------------+-----------------+-----------+&#xA;```&#xA;&#xA;So, now we have the `last_minute_signups` table cleaned up, with just a `SELECT`; we could `UPDATE` the rows in place if needed.&#xA;&#xA;We now need to combine both tables, the cleaned-up `last_minute_signups` and the `official_shifts`. We can use `UNION` to take out the duplicates from the two selections.&#xA;Remember to order the columns the same way in both `SELECT`s:&#xA;- volunteer_name&#xA;- role&#xA;- shift_time&#xA;&#xA;The column names don&#39;t have to be the same, but I am keeping them the same for clarity.&#xA;&#xA;Why `UNION`?&#xA;- Because we have data in both the tables.&#xA;- There is no relation between the two tables; they hold the same kind of rows, just with columns that are not cleaned or in the proper format.&#xA;- We want to grab all the rows from one table, all from the other, and remove the duplicates; that&#39;s the definition of `UNION`.&#xA;&#xA;We can&#39;t use `UNION ALL`, as it would include all the rows from both tables without removing duplicates.&#xA;&#xA;```sql&#xA;SELECT volunteer_name, &#xA;    CASE &#xA;        WHEN assigned_task LIKE &#39;choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE &#39;%shovel%&#39; THEN &#39;snow_showel&#39; &#xA;    END AS assigned_task,&#xA;    CASE &#xA;        WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39;&#xA;    END AS time_slot&#xA;FROM last_minute_signups &#xA;UNION
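-- Note: UNION pairs the two SELECT lists by column position, not by
-- column name, so the three columns below must line up with the three
-- columns above. UNION also de-duplicates the combined rows, while
-- UNION ALL would keep every row from both sides.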
&#xA;SELECT &#xA;    volunteer_name,&#xA;    role,&#xA;    shift_time&#xA;FROM official_shifts &#xA;ORDER BY volunteer_name;&#xA;&#xA;```&#xA;&#xA;Ok, that is a mess, isn&#39;t it?&#xA;&#xA;```&#xA;sqlite&gt; SELECT volunteer_name, &#xA;    CASE &#xA;        WHEN assigned_task LIKE &#39;choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE &#39;%shovel%&#39; THEN &#39;snow_showel&#39; &#xA;    END AS assigned_task,&#xA;    CASE &#xA;        WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39;&#xA;        WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39;&#xA;    END AS time_slot&#xA;FROM last_minute_signups &#xA;UNION &#xA;SELECT &#xA;    volunteer_name,&#xA;    role,&#xA;    shift_time&#xA;FROM official_shifts &#xA;ORDER BY volunteer_name;&#xA;&#xA;+-------------------+-----------------+-----------+&#xA;|  volunteer_name   |  assigned_task  | time_slot |&#xA;+-------------------+-----------------+-----------+&#xA;| Aaron Carter      | parking_support | 2:00 PM   |&#xA;| Aaron Diaz        | choir_assistant | 2:00 PM   |&#xA;| Aaron Diaz        | choir_assistant | 2:00 PM   |&#xA;| Aaron Evans       | cocoa_station   | 2:00 PM   |&#xA;| Aaron Francis     | hand_warmer     | 2:00 PM   |&#xA;| Abigail Hernandez | stage_setup     | 10:00 AM  |&#xA;| Adam King         | stage_setup     | 10:00 AM  |&#xA;| Adil Foster       | stage_setup     | 2:00 PM   |&#xA;| Adil Rahman       | stage_setup     | 2:00 PM   |&#xA;| Adil Rahman       | stage_setup     | 2:00 PM   |&#xA;| Adrian Cox        | cocoa_station   | 10:00 AM  |&#xA;| Aisha Bennett     | 
cocoa_station   | 12:00 PM  |&#xA;| Aisha Khan        | choir_assistant | 12:00 PM  |&#xA;| Aisha Khan        | choir_assistant | 12:00 PM  |&#xA;| Aisha Mohammed    | cocoa_station   | 2:00 PM   |&#xA;+-------------------+-----------------+-----------+&#xA;```&#xA;&#xA;There we have it.&#xA;&#xA;Let&#39;s count the number of distinct volunteers in the shifts.&#xA;&#xA;```sql&#xA;SELECT COUNT(*) FROM (SELECT volunteer_name, role, shift_time FROM official_shifts UNION  SELECT volunteer_name, CASE &#xA;        WHEN assigned_task LIKE &#39;%choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;%stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;%parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;%hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE &#39;%shovel%&#39; THEN &#39;snow_showel&#39; &#xA;    END AS role, CASE WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39; WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39; WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39; END AS shift_time FROM last_minute_signups) AS volunteers;&#xA;```&#xA;&#xA;We just counted the full union by wrapping it in `SELECT COUNT(*) FROM (&lt;THE FULL QUERY&gt;) AS volunteers;`.&#xA;&#xA;```&#xA;sqlite&gt; SELECT COUNT(*) FROM (SELECT volunteer_name, role, shift_time FROM official_shifts UNION  SELECT volunteer_name, CASE &#xA;        WHEN assigned_task LIKE &#39;%choir%&#39; THEN &#39;choir_assistant&#39;&#xA;        WHEN assigned_task LIKE &#39;%stage%&#39; THEN &#39;stage_setup&#39;&#xA;        WHEN assigned_task LIKE &#39;%cocoa%&#39; THEN &#39;cocoa_station&#39;&#xA;        WHEN assigned_task LIKE &#39;%parking%&#39; THEN &#39;parking_support&#39;&#xA;        WHEN assigned_task LIKE &#39;%hand%&#39; THEN &#39;hand_warmer&#39;&#xA;        WHEN assigned_task LIKE 
&#39;%shovel%&#39; THEN &#39;snow_showel&#39; &#xA;    END AS role, CASE WHEN time_slot LIKE &#39;2%&#39; THEN &#39;2:00 PM&#39; WHEN time_slot LIKE &#39;noon&#39; THEN &#39;12:00 PM&#39; WHEN time_slot LIKE &#39;10%&#39; THEN &#39;10:00 AM&#39; END AS shift_time FROM last_minute_signups) AS volunteers;&#xA;&#xA;+------------+&#xA;| volunteers |&#xA;+------------+&#xA;| 284        |&#xA;+------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;So, we have `284` de-duplicated rows. Looks good.&#xA;&#xA;Pinebrook can see the volunteer list now, the cleaned one.&#xA;&#xA;Off to day 5!</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 3: Hotline Messages</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-3</link>
      <description>Advent of SQL Day 3 - Hotline Messages This is day 3 from the Advent of SQL Grab the SQL Statements Let&#39;s take the insert statements i.e. to create and populate</description>
      <pubDate>Thu, 18 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Advent of SQL Day 3 - Hotline Messages&#xA;&#xA;This is day 3 from the Advent of SQL.&#xA;&#xA;## Grab the SQL Statements&#xA;&#xA;Let&#39;s take the insert statements, i.e. the statements that create the tables and populate the rows in the database. I am using SQLite.&#xA;&#xA;It works without any special shenanigans; it was intended to be used with Postgres, but the table and use case look very simple, so nothing specific to Postgres is used yet! We are good!&#xA;&#xA;Here is the SQL setup, if you want to play with it in the playground:&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS hotline_messages;&#xA;&#xA;CREATE TABLE hotline_messages (&#xA;    id INT PRIMARY KEY,&#xA;    caller_name TEXT,&#xA;    transcript TEXT,&#xA;    tag TEXT,&#xA;    status TEXT&#xA;);&#xA;&#xA;INSERT INTO hotline_messages (id, caller_name, transcript, tag, status) VALUES&#xA;    (1, &#39;Saanvi A.&#39;, &#39;I just found a refrigerator portal that leads to a disco party hosted by dancing llamas—please send help!&#39;, &#39;possible dragon&#39;, &#39;spam&#39;),&#xA;    (2, &#39;Fatima Q.&#39;, &#39;Hi Santa, I would love a magical unicorn that lights up at night!&#39;, &#39;wish list&#39;, NULL),&#xA;    (3, &#39;Lillian Z.&#39;, &#39;Hi Santa, I would love the magical fairy garden set, please!&#39;, &#39;wish list&#39;, &#39;approved&#39;),&#xA;    (4, &#39;Carter Y.&#39;, &#39;Thank you, Santa, for making Christmas so special with your wonderful spirit!&#39;, &#39;thank you&#39;, &#39;approved&#39;),&#xA;    (5, &#39;Omar R.&#39;, &#39;Hi Santa, I would love a rainbow unicorn plushie that has a glittery horn!&#39;, &#39;wish list&#39;, &#39;approved&#39;),&#xA;    (6, &#39;Diego Y.&#39;, &#39;Hi Santa, I would love a magical unicorn plushie that glows in the dark!&#39;, &#39;wish list&#39;, NULL),&#xA;    (7, &#39;Layla X.&#39;, &#39;Thank you, Santa, for spreading joy and magic every Christmas!&#39;, &#39;thank you&#39;, NULL),&#xA;    (8, &#39;Sophia K.&#39;, &#39;Santa, my cat said she wants 
to visit the candy cane forest next week.&#39;, NULL, NULL),&#xA;    (9, &#39;Eli H.&#39;, &#39;Hi Santa, I would love the magical fairy castle with twinkling lights!&#39;, &#39;wish list&#39;, &#39;approved&#39;),&#xA;    (10, &#39;Logan F.&#39;, &#39;Santa, I think the reindeer are starting a band with the garden gnomes.&#39;, &#39;needs clarification&#39;, NULL),&#xA;    (11, &#39;Carlos P.&#39;, &#39;Thank you, Santa, for making Christmas so special every year!&#39;, &#39;thank you&#39;, NULL),&#xA;    (12, &#39;Zain G.&#39;, &#39;Thank you, Santa, for bringing joy to all the children around the world!&#39;, &#39;thank you&#39;, NULL),&#xA;    (13, &#39;Haruto R.&#39;, &#39;Thank you, Santa, for spreading so much joy and magic every Christmas!&#39;, &#39;thank you&#39;, &#39;approved&#39;),&#xA;    (14, &#39;Oliver L.&#39;, &#39;Thank you, Santa, for spreading joy and making Christmas extra special!&#39;, &#39;thank you&#39;, NULL),&#xA;    (15, &#39;Luca M.&#39;, &#39;Hi Santa, could I please have the super cool glow-in-the-dark rocket ship?&#39;, &#39;wish list&#39;, NULL),&#xA;    (16, &#39;Samuel C.&#39;, &#39;sorry, Santa, my teddy bear said he wants to be a reindeer this year.&#39;, &#39;needs clarification&#39;, NULL);&#xA;```&#xA;&#xA;Here&#39;s the setup I did to check the data.&#xA;&#xA;```sql&#xA;SELECT * FROM hotline_messages;&#xA;```&#xA;&#xA;```plaintext&#xA;$ sqlite3&#xA;SQLite version 3.45.1 2024-01-30 16:01:20&#xA;Enter &#34;.help&#34; for usage hints.&#xA;Connected to a transient in-memory database.&#xA;Use &#34;.open FILENAME&#34; to reopen on a persistent database.&#xA;&#xA;sqlite&gt; .read day3-inserts.sql&#xA;&#xA;sqlite&gt; .schema&#xA;CREATE TABLE hotline_messages (&#xA;    id INT PRIMARY KEY,&#xA;    caller_name TEXT,&#xA;    transcript TEXT,&#xA;    tag TEXT,&#xA;    status TEXT&#xA;);&#xA;&#xA;sqlite&gt; .mode table&#xA;sqlite&gt; SELECT * FROM hotline_messages LIMIT 
10;&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| id | caller_name |                          transcript                          |         tag         |  status  |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 1  | Saanvi A.   | I just found a refrigerator portal that leads to a disco par | possible dragon     | spam     |&#xA;|    |             | ty hosted by dancing llamas—please send help!                |                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 2  | Fatima Q.   | Hi Santa, I would love a magical unicorn that lights up at n | wish list           |          |&#xA;|    |             | ight!                                                        |                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 3  | Lillian Z.  | Hi Santa, I would love the magical fairy garden set, please! | wish list           | approved |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 4  | Carter Y.   | Thank you, Santa, for making Christmas so special with your  | thank you           | approved |&#xA;|    |             | wonderful spirit!                                            |                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 5  | Omar R.     | Hi Santa, I would love a rainbow unicorn plushie that has a  | wish list           | approved |&#xA;|    |             | glittery horn!                                               
|                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 6  | Diego Y.    | Hi Santa, I would love a magical unicorn plushie that glows  | wish list           |          |&#xA;|    |             | in the dark!                                                 |                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 7  | Layla X.    | Thank you, Santa, for spreading joy and magic every Christma | thank you           |          |&#xA;|    |             | s!                                                           |                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 8  | Sophia K.   | Santa, my cat said she wants to visit the candy cane forest  |                     |          |&#xA;|    |             | next week.                                                   |                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 9  | Eli H.      | Hi Santa, I would love the magical fairy castle with twinkli | wish list           | approved |&#xA;|    |             | ng lights!                                                   |                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 10 | Logan F.    | Santa, I think the reindeer are starting a band with the gar | needs clarification |          |&#xA;|    |             | den gnomes.                                                  
|                     |          |&#xA;+----+-------------+--------------------------------------------------------------+---------------------+----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;So, we just have one table, called `hotline_messages`, and it has a few columns:&#xA;&#xA;1. `caller_name`&#xA;    &#xA;2. `transcript`&#xA;    &#xA;3. `tag`&#xA;    &#xA;4. `status`&#xA;    &#xA;&#xA;What do we want to do with those?&#xA;&#xA;Well! Let&#39;s get into the problem statement.&#xA;&#xA;## Problem&#xA;&#xA;Here goes the challenge for day 3:&#xA;&#xA;&gt; Using the `hotline_messages` table, update any record that has &#34;sorry&#34; (case insensitive) in the transcript and doesn&#39;t currently have a status assigned to have a status of &#34;approved&#34;.&#xA;&gt; &#xA;&gt; Then delete any records where the tag is &#34;penguin prank&#34;, &#34;time-loop advisory&#34;, &#34;possible dragon&#34;, or &#34;nonsense alert&#34; or if the caller&#39;s name is &#34;Test Caller&#34;.&#xA;&gt; &#xA;&gt; After updating and deleting the records as described, write a final query that returns how many messages currently have a status of &#34;approved&#34; and how many still need to be reviewed (i.e., status is NULL).&#xA;&#xA;It&#39;s divided into 3 parts, so do we need three queries? I don&#39;t want a single long query doing all of this, and after reading it, it clearly shouldn&#39;t be one query: we need an update, a delete, and then a final select after those changes.&#xA;&#xA;So, we have to do three things.&#xA;&#xA;1. Find the records which have `sorry` in the transcript text and mark their `status` as `approved` (What a lovely gesture)&#xA;    &#xA;2. Find all records tagged as either `penguin prank`, `time-loop advisory`, `possible dragon`, or `nonsense alert`, or whose `caller_name` is `Test Caller`, and delete those records, yes take&#39;em out of my way.&#xA;    &#xA;3. 
After doing those 2 things, we have to count the number of records with `status` as `approved` and the number of records that are still not `approved` (they are in review, i.e. the status is `NULL`)&#xA;    &#xA;&#xA;So, let&#39;s do them step by step.&#xA;&#xA;### Be generous&#xA;&#xA;Let&#39;s be generous like Santa says and mark the records with `status` as `approved` whose transcripts have the word `sorry` in them. Let those children be gifted their reward of being generous and humble.&#xA;&#xA;How do we do that in SQL? Well, let&#39;s first look at what we are updating.&#xA;&#xA;```sql&#xA;SELECT * FROM hotline_messages WHERE transcript LIKE &#39;%sorry%&#39;;&#xA;```&#xA;&#xA;So, will it be sufficient? I think so.&#xA;&#xA;Because&#xA;&#xA;* `LIKE` is **case insensitive** by default in SQLite (for ASCII letters), so it can catch `sorry`, `Sorry`, `SoRRY`, `sorrY`&#xA;    &#xA;* `%` before and after will catch the word `sorry` in the middle of the sentence, not necessarily at the start.&#xA;    &#xA;&#xA;Strictly speaking, the challenge also says the record shouldn&#39;t already have a status, so `AND status IS NULL` would be the precise filter; it would also guarantee that rows already marked with another status (like `spam`) stay untouched.&#xA;&#xA;I can see 104 rows selected with this condition. I always try to check, before updating or deleting, how many rows will be affected. Because sometimes we start `UPDATE hotline_messages SET status = &#39;approved&#39;` and forget the WHERE! This gets worse with DELETE, believe me!&#xA;&#xA;```sql&#xA;SELECT count(*) FROM hotline_messages WHERE transcript LIKE &#39;%sorry%&#39;;&#xA;SELECT count(*) FROM hotline_messages;&#xA;```&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT * FROM hotline_messages WHERE transcript LIKE &#39;%sorry%&#39;;&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;|  id  | caller_name  |                          transcript                          |         tag         |  status  |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 16   | Samuel C.    
| sorry, Santa, my teddy bear said he wants to be a reindeer t | needs clarification |          |&#xA;|      |              | his year.                                                    |                     |          |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 33   | Jacob F.     | sorry, Hi Santa, I would love the magical unicorn plushie th | wish list           |          |&#xA;|      |              | at glows in the dark!                                        |                     |          |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 46   | Jun Y.       | sorry, Hi Santa, I would love a magical unicorn stuffed anim | wish list           |          |&#xA;|      |              | al that glows in the dark!                                   |                     |          |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;&#xA;sqlite&gt; SELECT count(*) FROM hotline_messages WHERE transcript LIKE &#39;%sorry%&#39;;&#xA;+----------+&#xA;| count(*) |&#xA;+----------+&#xA;| 104      |&#xA;+----------+&#xA;sqlite&gt; SELECT count(*) FROM hotline_messages ;--WHERE transcript LIKE &#39;%sorry%&#39;;&#xA;+----------+&#xA;| count(*) |&#xA;+----------+&#xA;| 1067     |&#xA;+----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;So, once I know `104` rows will be affected out of `1067` I can create the update statement.&#xA;&#xA;We want to update the status and set it to `approved` for the rows which we selected just now (have `sorry` in the transcript text)&#xA;&#xA;```sql&#xA;UPDATE hotline_messages&#xA;SET status = &#39;approved&#39;&#xA;WHERE transcript LIKE &#39;%sorry%&#39;;&#xA;```&#xA;&#xA;Now, when we select again&#xA;&#xA;```sql&#xA;SELECT * FROM hotline_messages WHERE transcript LIKE &#39;%sorry%&#39;; 
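-- As a sketch: SQLite&#39;s changes() function, if run immediately after the UPDATE,&#xA;-- reports how many rows it touched (it should match the 104 we counted above):&#xA;-- SELECT changes();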
&#xA;```&#xA;&#xA;All approved!&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT * FROM hotline_messages WHERE transcript LIKE &#39;%sorry%&#39;;&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;|  id  | caller_name  |                          transcript                          |         tag         |  status  |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 16   | Samuel C.    | sorry, Santa, my teddy bear said he wants to be a reindeer t | needs clarification | approved |&#xA;|      |              | his year.                                                    |                     |          |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 33   | Jacob F.     | sorry, Hi Santa, I would love the magical unicorn plushie th | wish list           | approved |&#xA;|      |              | at glows in the dark!                                        |                     |          |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;| 46   | Jun Y.       | sorry, Hi Santa, I would love a magical unicorn stuffed anim | wish list           | approved |&#xA;|      |              | al that glows in the dark!                                   
|                     |          |&#xA;+------+--------------+--------------------------------------------------------------+---------------------+----------+&#xA;```&#xA;&#xA;On to the next step then.&#xA;&#xA;### Remove Spam&#xA;&#xA;To reiterate the second part of the challenge:&#xA;&#xA;&gt; Then delete any records where the tag is &#34;penguin prank&#34;, &#34;time-loop advisory&#34;, &#34;possible dragon&#34;, or &#34;nonsense alert&#34; or if the caller&#39;s name is &#34;Test Caller&#34;.&#xA;&#xA;We basically need to find all records tagged as either `penguin prank`, `time-loop advisory`, `possible dragon`, or `nonsense alert`, or whose `caller_name` is `Test Caller`, and delete those records, yes take&#39;em out of my way.&#xA;&#xA;So, again: select first, update or delete later.&#xA;&#xA;```sql&#xA;SELECT * FROM hotline_messages WHERE tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;);&#xA;```&#xA;&#xA;Here `IN` is a great helper, as it compacts the equivalent of this:&#xA;&#xA;```sql&#xA;SELECT * FROM hotline_messages &#xA;WHERE &#xA;    tag = &#39;penguin prank&#39;&#xA;    OR tag = &#39;time-loop advisory&#39;&#xA;    OR tag = &#39;possible dragon&#39;&#xA;    OR tag = &#39;nonsense alert&#39;;&#xA;```&#xA;&#xA;That is a lot of `OR tag =` saved by `IN` with a list of values. 
Handy little operator.&#xA;&#xA;The count here is `68`:&#xA;&#xA;```sql&#xA;SELECT count(*) FROM hotline_messages WHERE tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;);&#xA;```&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT count(*) FROM hotline_messages WHERE tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;);&#xA;+----------+&#xA;| count(*) |&#xA;+----------+&#xA;| 68       |&#xA;+----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;We also need to check if the `caller_name` is `Test Caller`.&#xA;&#xA;It has to be `OR` here:&#xA;&#xA;```sql&#xA;SELECT * FROM hotline_messages &#xA;WHERE &#xA;     tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;)&#xA;     OR caller_name = &#39;Test Caller&#39;;&#xA;```&#xA;&#xA;That&#39;s it. Let&#39;s count the number of rows we will be deleting soon.&#xA;&#xA;```sql&#xA;SELECT count(*) FROM hotline_messages&#xA;WHERE&#xA;     tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;) &#xA;     OR caller_name = &#39;Test Caller&#39;;&#xA;```&#xA;&#xA;So, we have `89` rows to delete after taking the spammy tags and test callers into account.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT count(*) FROM hotline_messages WHERE tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;) OR caller_name = &#39;Test Caller&#39;;&#xA;+----------+&#xA;| count(*) |&#xA;+----------+&#xA;| 89       |&#xA;+----------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Let&#39;s get the spam outta here!&#xA;&#xA;```sql&#xA;DELETE FROM hotline_messages&#xA;WHERE tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;) OR caller_name = &#39;Test Caller&#39;;&#xA;```&#xA;&#xA;Phew! Done, 89 spammy records removed! 
Santa might be relieved.&#xA;&#xA;```plaintext&#xA;sqlite&gt; DELETE FROM hotline_messages&#xA;WHERE tag IN (&#39;penguin prank&#39;, &#39;time-loop advisory&#39;, &#39;possible dragon&#39;, &#39;nonsense alert&#39;) OR caller_name = &#39;Test Caller&#39;;&#xA;&#xA;sqlite&gt; SELECT changes();&#xA;+-----------+&#xA;| changes() |&#xA;+-----------+&#xA;| 89        |&#xA;+-----------+&#xA;```&#xA;&#xA;The changes are done; now we simply have to select and count the records which are approved and those still in review.&#xA;&#xA;### Count&#39;em down&#xA;&#xA;* After doing those 2 things, we have to count the number of records with `status` as `approved` and the number of records that are still not `approved` (they are in review, i.e. the status is `NULL`)&#xA;    &#xA;&#xA;So, we need to get the count of:&#xA;&#xA;1. Records with `status` as `approved`&#xA;    &#xA;2. Records with `status` as `NULL`&#xA;    &#xA;&#xA;#### Separate Queries&#xA;&#xA;This looks straightforward; you can write two separate queries:&#xA;&#xA;```sql&#xA;SELECT COUNT(*) as approved_count FROM hotline_messages WHERE status = &#39;approved&#39;;&#xA;SELECT COUNT(*) as in_review_count FROM hotline_messages WHERE status IS NULL;&#xA;```&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT COUNT(*) as approved_count FROM hotline_messages WHERE status = &#39;approved&#39;;&#xA;SELECT COUNT(*) as in_review_count FROM hotline_messages WHERE status IS NULL;&#xA;+----------------+&#xA;| approved_count |&#xA;+----------------+&#xA;| 477            |&#xA;+----------------+&#xA;+-----------------+&#xA;| in_review_count |&#xA;+-----------------+&#xA;| 501             |&#xA;+-----------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;#### Group by Status&#xA;&#xA;But can we do it in one query?&#xA;&#xA;Think!&#xA;&#xA;There are just 2 types of status, right?&#xA;&#xA;Let&#39;s check:&#xA;&#xA;```sql&#xA;SELECT DISTINCT status FROM hotline_messages;&#xA;```&#xA;&#xA;Hmm! 2? 
`NULL` and `approved`!&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT DISTINCT status FROM hotline_messages;&#xA;+----------+&#xA;|  status  |&#xA;+----------+&#xA;|          |&#xA;| approved |&#xA;+----------+&#xA;```&#xA;&#xA;So we can simply do the same thing, but just group by `status`, right? Like so:&#xA;&#xA;```sql&#xA;SELECT status, COUNT(*) as count&#xA;FROM hotline_messages&#xA;GROUP BY status;&#xA;```&#xA;&#xA;And this should give us back the two rows with the count of `NULL` and `approved`.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT status, COUNT(*) as count&#xA;FROM hotline_messages&#xA;GROUP BY status; &#xA;+----------+-------+&#xA;|  status  | count |&#xA;+----------+-------+&#xA;|          | 501   |&#xA;| approved | 477   |&#xA;+----------+-------+&#xA;```&#xA;&#xA;Is there a better way?&#xA;&#xA;This looks a little weird! The status shows up empty (`NULL`), which makes an odd view for people to look at; can we do something different?&#xA;&#xA;#### Cases when then else&#xA;&#xA;This is a simple use case for `CASE WHEN ... THEN ... ELSE ... END`.&#xA;&#xA;Each `WHEN` checks a condition and yields a value; the `ELSE` (or nothing, which means `NULL`) covers the rest.&#xA;&#xA;In this case, if the status is `approved` the `CASE` yields 1, otherwise it yields `NULL`, which `COUNT` simply ignores; the second `CASE` does the same for rows where the status is `NULL`.&#xA;&#xA;```sql&#xA;SELECT &#xA;    COUNT(CASE WHEN status = &#39;approved&#39; THEN 1 END) AS approved_count,&#xA;    COUNT(CASE WHEN status IS NULL THEN 1 END) AS in_review_count&#xA;FROM &#xA;    hotline_messages;&#xA;```&#xA;&#xA;What this will do is, for each row, count toward either `approved_count` or `in_review_count` depending on the value of the `status` cell. 
If that is `approved` we increment `approved_count`, else if it is `NULL` we increment `in_review_count`.&#xA;&#xA;Slick!&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT &#xA;    COUNT(CASE WHEN status = &#39;approved&#39; THEN 1 END) AS approved_count,&#xA;    COUNT(CASE WHEN status IS NULL THEN 1 END) AS in_review_count&#xA;FROM &#xA;    hotline_messages;&#xA;+----------------+-----------------+&#xA;| approved_count | in_review_count |&#xA;+----------------+-----------------+&#xA;| 477            | 501             |&#xA;+----------------+-----------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;That&#39;s it for day 3. Hopefully Santa is happy, and nowhere in sight of getting madder as the elves get dumber.</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025 Day 2: Snowballs</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-2</link>
      <description>SQLog: Advent of SQL Day 2 Here we are on the day 2 of Advent of SQL As I said in the previous day this is in SQLite so I won&#39;t be doing it in the playground. S</description>
      <pubDate>Wed, 17 Dec 2025 00:00:00 UTC</pubDate>
<content>## SQLog: Advent of SQL Day 2&#xA;&#xA;Here we are on day 2 of Advent of SQL.&#xA;&#xA;As I said on the previous day, this one is in SQLite, so I won&#39;t be doing it in the playground. So here is your SQLite playground :)&#xA;&#xA;```sql&#xA;SELECT 1;&#xA;```&#xA;&#xA;From now on, no setup: straight to the problem!&#xA;&#xA;Let&#39;s download the .sql file for today&#39;s problem to see what data we are playing with.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS snowball_categories;&#xA;DROP TABLE IF EXISTS snowball_inventory;&#xA;&#xA;CREATE TABLE snowball_categories (&#xA;    id INT PRIMARY KEY,&#xA;    official_category TEXT&#xA;);&#xA;&#xA;CREATE TABLE snowball_inventory (&#xA;    id INT PRIMARY KEY,&#xA;    batch_id TEXT,&#xA;    category_name TEXT,&#xA;    quantity INT,&#xA;    status TEXT&#xA;);&#xA;```&#xA;&#xA;Well, neat and clean!&#xA;&#xA;Pulling it straight into the sqlite shell.&#xA;&#xA;```plaintext&#xA;$ sqlite3 &#xA;SQLite version 3.45.1 2024-01-30 16:01:20&#xA;Enter &#34;.help&#34; for usage hints.&#xA;Connected to a transient in-memory database.&#xA;Use &#34;.open FILENAME&#34; to reopen on a persistent database.&#xA;sqlite&gt; .read day2-inserts.sql&#xA;sqlite&gt; .schema&#xA;CREATE TABLE snowball_categories (&#xA;    id INT PRIMARY KEY,&#xA;    official_category TEXT&#xA;);&#xA;CREATE TABLE snowball_inventory (&#xA;    id INT PRIMARY KEY,&#xA;    batch_id TEXT,&#xA;    category_name TEXT,&#xA;    quantity INT,&#xA;    status TEXT&#xA;);&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Worked right off the bat.&#xA;&#xA;Straight to the problem now.&#xA;&#xA;If you are working in the playground, do add this code with the inserts to get a sense of what the data looks like; note that the statements below are not all of the data, just a sample to give you a taste of the problem.&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS snowball_categories;&#xA;DROP TABLE IF EXISTS snowball_inventory;&#xA;&#xA;CREATE TABLE snowball_categories (&#xA;    id INT PRIMARY KEY,&#xA;    
official_category TEXT&#xA;);&#xA;&#xA;CREATE TABLE snowball_inventory (&#xA;    id INT PRIMARY KEY,&#xA;    batch_id TEXT,&#xA;    category_name TEXT,&#xA;    quantity INT,&#xA;    status TEXT&#xA;);&#xA;&#xA;INSERT INTO snowball_categories (id, official_category) VALUES&#xA;    (1, &#39;frost-flight deluxe&#39;),&#xA;    (2, &#39;north ridge compact&#39;),&#xA;    (3, &#39;glacier sphere (xl)&#39;),&#xA;    (4, &#39;polar precision microball&#39;),&#xA;    (5, &#39;everfrost training round&#39;),&#xA;    (6, &#39;arctic blast premium&#39;);&#xA;&#xA;INSERT INTO snowball_inventory (id, batch_id, category_name, quantity, status) VALUES&#xA;    (1, &#39;BATCH-35443-J&#39;, &#39;frost-flight deluxe&#39;, 19, NULL),&#xA;    (2, &#39;BATCH-59767-M&#39;, &#39;frost-flight deluxe&#39;, 41, &#39;incomplete&#39;),&#xA;    (3, &#39;BATCH-44795-B&#39;, &#39;frost-flight deluxe&#39;, 21, &#39;ready&#39;),&#xA;    (4, &#39;BATCH-23396-C&#39;, &#39;north ridge compact&#39;, 0, &#39;incomplete&#39;),&#xA;    (5, &#39;BATCH-88907-A&#39;, &#39;frost-flight deluxe&#39;, -2, &#39;incomplete&#39;),&#xA;    (6, &#39;BATCH-42662-D&#39;, &#39;frost-flight deluxe&#39;, 47, &#39;needs review&#39;),&#xA;    (7, &#39;BATCH-37460-V&#39;, &#39;north ridge compact&#39;, 43, &#39;ready&#39;),&#xA;    (8, &#39;BATCH-21395-S&#39;, &#39;frost-flight deluxe&#39;, -2, &#39;ready&#39;),&#xA;    (9, &#39;BATCH-36100-E&#39;, &#39;frost-flight deluxe&#39;, 46, &#39;ready&#39;),&#xA;    (10, &#39;BATCH-64987-H&#39;, &#39;frost-flight deluxe&#39;, 43, NULL),&#xA;    (11, &#39;BATCH-57576-Z&#39;, &#39;melty deluxe&#39;, -5, &#39;ready&#39;),&#xA;    (12, &#39;BATCH-56025-U&#39;, &#39;snowball v2&#39;, 11, &#39;ready&#39;),&#xA;    (13, &#39;BATCH-86556-W&#39;, &#39;snowball v2&#39;, 12, &#39;ready&#39;),&#xA;    (14, &#39;BATCH-83385-N&#39;, &#39;frost-flight deluxe&#39;, 38, &#39;incomplete&#39;),&#xA;    (15, &#39;BATCH-85156-M&#39;, &#39;prototype x-12&#39;, 28, &#39;incomplete&#39;),&#xA;    (16, 
&#39;BATCH-82135-F&#39;, &#39;north ridge compact&#39;, 32, &#39;incomplete&#39;),&#xA;    (17, &#39;BATCH-10074-T&#39;, &#39;frost-flight deluxe&#39;, 49, &#39;needs review&#39;),&#xA;    (18, &#39;BATCH-22676-L&#39;, &#39;frost-flight deluxe&#39;, 16, &#39;incomplete&#39;),&#xA;    (19, &#39;BATCH-31174-R&#39;, &#39;north ridge compact&#39;, 33, &#39;incomplete&#39;),&#xA;    (20, &#39;BATCH-41385-B&#39;, &#39;frost-flight deluxe&#39;, 4, &#39;ready&#39;),&#xA;    (21, &#39;BATCH-50404-L&#39;, &#39;frost-flight deluxe&#39;, -4, &#39;needs review&#39;),&#xA;    (22, &#39;BATCH-92240-F&#39;, &#39;north ridge compact&#39;, 20, &#39;ready&#39;),&#xA;    (23, &#39;BATCH-29198-J&#39;, &#39;beta test sphere&#39;, 0, &#39;incomplete&#39;),&#xA;    (24, &#39;BATCH-64987-H&#39;, &#39;glacier sphere (xl)&#39;, 18, &#39;needs review&#39;),&#xA;    (25, &#39;BATCH-80008-A&#39;, &#39;frost-flight deluxe&#39;, 3, &#39;incomplete&#39;),&#xA;    (26, &#39;BATCH-88907-A&#39;, &#39;polar precision microball&#39;, 48, &#39;incomplete&#39;),&#xA;    (27, &#39;BATCH-55830-J&#39;, &#39;north ridge compact&#39;, 0, &#39;needs review&#39;),&#xA;    (28, &#39;BATCH-69470-A&#39;, &#39;frost-flight deluxe&#39;, -3, &#39;incomplete&#39;),&#xA;    (29, &#39;BATCH-46211-R&#39;, &#39;frost-flight deluxe&#39;, -3, &#39;ready&#39;),&#xA;    (30, &#39;BATCH-18675-G&#39;, &#39;glacier sphere (xl)&#39;, -1, &#39;ready&#39;);&#xA;```&#xA;&#xA;## Problem&#xA;&#xA;Let&#39;s check the challenge of day 2&#xA;&#xA;&gt; Using the `snowball_inventory` and `snowball_categories` tables, write a query that returns valid snowball categories with the count of valid snowballs per category. Your final table should have the columns `official_category` and `total_usable_snowballs`. Sort the output from fewest to most `total_usable_snowballs`.&#xA;&#xA;So, we have two tables:&#xA;&#xA;1. snowball categories&#xA;    &#xA;2. 
snowball inventory&#xA;    &#xA;&#xA;The snowball categories table looks quite small, with just the category name and an id; the id is not really the data, the name is.&#xA;&#xA;```sql&#xA;SELECT * FROM snowball_categories;&#xA;```&#xA;&#xA;Just 6 entries with the names of the categories, in the `official_category` column.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT * FROM snowball_categories;&#xA;+----+---------------------------+&#xA;| id |     official_category     |&#xA;+----+---------------------------+&#xA;| 1  | frost-flight deluxe       |&#xA;| 2  | north ridge compact       |&#xA;| 3  | glacier sphere (xl)       |&#xA;| 4  | polar precision microball |&#xA;| 5  | everfrost training round  |&#xA;| 6  | arctic blast premium      |&#xA;+----+---------------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;And the other table has oodles of data:&#xA;&#xA;```sql&#xA;SELECT * FROM snowball_inventory limit 10;&#xA;```&#xA;&#xA;So, we have quite a few columns:&#xA;&#xA;1. batch\_id&#xA;    &#xA;2. category\_name&#xA;    &#xA;3. quantity&#xA;    &#xA;4. 
status&#xA;    &#xA;&#xA;It has 200,000 rows, that&#39;s quite a lot.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT count(*) FROM snowball_inventory;&#xA;+----------+&#xA;| count(*) |&#xA;+----------+&#xA;| 200000   |&#xA;+----------+&#xA;&#xA;sqlite&gt; SELECT * FROM snowball_inventory limit 10;&#xA;+----+---------------+---------------------+----------+--------------+&#xA;| id |   batch_id    |    category_name    | quantity |    status    |&#xA;+----+---------------+---------------------+----------+--------------+&#xA;| 1  | BATCH-35443-J | frost-flight deluxe | 19       |              |&#xA;| 2  | BATCH-59767-M | frost-flight deluxe | 41       | incomplete   |&#xA;| 3  | BATCH-44795-B | frost-flight deluxe | 21       | ready        |&#xA;| 4  | BATCH-23396-C | north ridge compact | 0        | incomplete   |&#xA;| 5  | BATCH-88907-A | frost-flight deluxe | -2       | incomplete   |&#xA;| 6  | BATCH-42662-D | frost-flight deluxe | 47       | needs review |&#xA;| 7  | BATCH-37460-V | north ridge compact | 43       | ready        |&#xA;| 8  | BATCH-21395-S | frost-flight deluxe | -2       | ready        |&#xA;| 9  | BATCH-36100-E | frost-flight deluxe | 46       | ready        |&#xA;| 10 | BATCH-64987-H | frost-flight deluxe | 43       |              |&#xA;+----+---------------+---------------------+----------+--------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;What do we have to do?&#xA;&#xA;&gt; return valid snowball categories with the count of valid snowballs per category&#xA;&#xA;That&#39;s a mouthful.&#xA;&#xA;The problem is trying to get at counting the usable inventory per category, I think, because there are far fewer categories and tons of inventory records.&#xA;&#xA;Also, the `category_name` in the `snowball_inventory` table is not trustworthy.&#xA;&#xA;&gt; Santa hurried to the snowball storage board, but the situation only got stranger. Whole batches appeared twice. 
Some batches claimed they had negative snowballs (“a bookkeeping accident,” the elves muttered). Others had a quantity of zero but were still marked “Ready.” And many batches referenced categories that didn’t appear anywhere in the official Snowball Category Guide&#xA;&#xA;## Naive Approach&#xA;&#xA;The first approach I see is to scan through all the inventory records and check if the category is in the `snowball_categories` table and its quantity is more than 0; I also wonder if the status needs to be checked as `ready`.&#xA;&#xA;&gt; “We need to know what we actually have left,” Santa said. “Not puddles. Not phantom batches. Real, usable, throw-ready snowballs.”&#xA;&#xA;**Real, throw-ready snowballs**&#xA;&#xA;Let&#39;s check how to do that with a simple subquery.&#xA;&#xA;```sql&#xA;SELECT * FROM snowball_inventory&#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;;&#xA;```&#xA;&#xA;OK!&#xA;&#xA;What did we do? We simply selected all the rows from the `snowball_inventory` table&#xA;&#xA;in which the category name matches one of the 6 categories in the `snowball_categories` table.&#xA;&#xA;Also, the quantity is positive and not zero, and the status is set to `ready`.&#xA;&#xA;Using a subquery looks naive to me, because for each of the 200,000 records we could be scanning `snowball_categories`. Ew!&#xA;&#xA;Wait, it&#39;s not done yet! We need the count for each category! 
We need to group by the `category_name`:&#xA;&#xA;```sql&#xA;SELECT * FROM snowball_inventory&#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;&#xA;GROUP BY category_name;&#xA;```&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT * FROM snowball_inventory&#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;&#xA;GROUP BY category_name;&#xA;+-----+---------------+---------------------------+----------+--------+&#xA;| id  |   batch_id    |       category_name       | quantity | status |&#xA;+-----+---------------+---------------------------+----------+--------+&#xA;| 240 | BATCH-55793-L | arctic blast premium      | 45       | ready  |&#xA;| 163 | BATCH-75333-O | everfrost training round  | 23       | ready  |&#xA;| 3   | BATCH-44795-B | frost-flight deluxe       | 21       | ready  |&#xA;| 39  | BATCH-22704-V | glacier sphere (xl)       | 37       | ready  |&#xA;| 7   | BATCH-37460-V | north ridge compact       | 43       | ready  |&#xA;| 125 | BATCH-81987-E | polar precision microball | 47       | ready  |&#xA;+-----+---------------+---------------------------+----------+--------+&#xA;```&#xA;&#xA;That sort of looks weird, right?&#xA;&#xA;Why?&#xA;&#xA;Because what happens to the quantity? Is it summed? Averaged? And what just happened to the batch\_id?&#xA;&#xA;No, it just takes one arbitrary row out of the `200,000` for each category. That&#39;s not what we want, right?&#xA;&#xA;We want this:&#xA;&#xA;&gt; return valid snowball categories with the count of valid snowballs per category&#xA;&#xA;So, we just want the category name and the count for each category. 
Basically, the total per category.&#xA;&#xA;```sql&#xA;SELECT category_name as official_category, sum(quantity) as total_usable_snowballs&#xA;FROM snowball_inventory&#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;&#xA;GROUP BY category_name;&#xA;```&#xA;&#xA;Here we are fetching only the `category_name` and `sum(quantity)`, which adds up the quantities of all rows grouped under each `category_name`.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT category_name as official_category, sum(quantity) as total_usable_snowballs &#xA;FROM snowball_inventory &#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;&#xA;GROUP BY category_name;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| frost-flight deluxe       | 952019                 |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| polar precision microball | 70773                  |&#xA;+---------------------------+------------------------+&#xA;sqlite&gt; &#xA;&#xA;```&#xA;&#xA;Not done yet!&#xA;&#xA;We need to order by the total.&#xA;&#xA;&gt; Sort the output from fewest to most `total_usable_snowballs`.&#xA;&#xA;```sql&#xA;SELECT category_name, sum(quantity) as total_usable_snowballs&#xA;FROM snowball_inventory&#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;&#xA;GROUP BY category_name&#xA;ORDER BY total_usable_snowballs;&#xA;```&#xA;&#xA;We can provide `ORDER BY total_usable_snowballs ASC` but ascending is 
default. I prefer keeping things default; you can be explicit and mention `ASC` to make it clear and readable.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT category_name as official_category, sum(quantity) as total_usable_snowballs&#xA;FROM snowball_inventory &#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;&#xA;GROUP BY category_name&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;&#xA;sqlite&gt; &#xA;```&#xA;&#xA;There it is!&#xA;&#xA;The solution to day 2.&#xA;&#xA;But Santa, I am using subqueries. Is that fine?&#xA;&#xA;## Joins?&#xA;&#xA;We can use joins here, since we just require the total for each category.&#xA;&#xA;Which JOIN though?&#xA;&#xA;LEFT, RIGHT, INNER?&#xA;&#xA;Any, really?&#xA;&#xA;You don&#39;t choose the join based on the problem; you define your output columns and then choose the type of join that gives you that result.&#xA;&#xA;If everything in the left table must appear in the result, then it&#39;s a left join, i.e. keep everything from the left. 
And so on.&#xA;&#xA;Here,&#xA;&#xA;I need all the `official_category` values, which live in the `snowball_categories` table.&#xA;&#xA;If I put `snowball_categories` on the left, I can join each row on the left with all the matching `category_name` rows in the `snowball_inventory` table on the right. Like so.&#xA;&#xA;```sql&#xA;SELECT&#xA;    snowball_categories.official_category as official_category,&#xA;    SUM(snowball_inventory.quantity) as total_usable_snowballs&#xA;FROM snowball_categories&#xA;LEFT JOIN snowball_inventory&#xA;     ON snowball_categories.official_category = snowball_inventory.category_name&#xA;     AND snowball_inventory.quantity &gt; 0&#xA;     AND snowball_inventory.status == &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;```&#xA;&#xA;We JOIN on the condition `snowball_categories.official_category = snowball_inventory.category_name`, which is to say the `category_name` in the `snowball_inventory` table should match the `official_category` from the `snowball_categories` table. The other conditions say the `quantity` should be more than `0` and the `status` should be `ready`.&#xA;&#xA;We will still need the GROUP BY and ORDER BY, as the join conditions only act as the filtering criteria that select or reject the bad categories. 
To obtain the total for each category, we still need to group by the category name.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT snowball_categories.official_category AS official_category, SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_categories&#xA;LEFT JOIN snowball_inventory&#xA;     ON snowball_categories.official_category = snowball_inventory.category_name&#xA;     AND snowball_inventory.quantity &gt; 0&#xA;     AND snowball_inventory.status == &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;```&#xA;&#xA;This is it! The same result that we got from the subquery; just the filtering part is different.&#xA;&#xA;How about INNER JOIN?&#xA;&#xA;`INNER JOIN` is different from `LEFT JOIN`: it keeps only the rows that match the condition, whereas `LEFT JOIN` will include every row from the `LEFT` table even when there is no match, filling the right side with `NULL`. 
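
That difference is easy to demonstrate with a small sketch in Python's built-in `sqlite3`, using made-up rows plus one invented "ghost category" that has no valid inventory:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE snowball_categories (official_category TEXT);
CREATE TABLE snowball_inventory (category_name TEXT, quantity INT, status TEXT);
INSERT INTO snowball_categories VALUES ('arctic blast premium'), ('ghost category');
INSERT INTO snowball_inventory VALUES ('arctic blast premium', 45, 'ready');
""")

# Same query shape as above; only the join keyword differs.
query = """
SELECT c.official_category, SUM(i.quantity) AS total
FROM snowball_categories c
{join} snowball_inventory i
  ON c.official_category = i.category_name
  AND i.quantity > 0 AND i.status = 'ready'
GROUP BY c.official_category
ORDER BY c.official_category
"""

left = con.execute(query.format(join="LEFT JOIN")).fetchall()
inner = con.execute(query.format(join="JOIN")).fetchall()

print(left)   # [('arctic blast premium', 45), ('ghost category', None)]
print(inner)  # [('arctic blast premium', 45)]
```

The LEFT JOIN keeps "ghost category" with a NULL total; the INNER JOIN drops it entirely.
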
`RIGHT JOIN` is similar: it will include everything from the `RIGHT` table even when there is no match, so that all the records from the `RIGHT` table are in the result set.&#xA;&#xA;For INNER JOIN, flipping the tables won&#39;t matter, as it only relies on the condition and not on which side (left or right) the tables are placed.&#xA;&#xA;```sql&#xA;SELECT&#xA;  snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;```&#xA;&#xA;This is with INNER JOIN. Nothing changed here, did it?&#xA;&#xA;Except the word `LEFT` is no longer there; the default join is `INNER JOIN`.&#xA;&#xA;In our case I don&#39;t think we have `NULL` values in the category name columns of the `snowball_categories` or `snowball_inventory` tables. 
So LEFT and INNER JOIN won&#39;t be different in terms of the results.&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT&#xA;  snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;sqlite&gt; &#xA;```&#xA;&#xA;However, if we make the `snowball_inventory` table the `LEFT` table, does it make a difference?&#xA;&#xA;Well, we don&#39;t want that as the left table in a `LEFT` JOIN: it would also list all the wrong categories in the result. Rows whose category is not in `official_category` would come through with a 0 or NULL category, and that would make a mess.&#xA;&#xA;We instead can do a RIGHT JOIN.&#xA;&#xA;With the table order as&#xA;&#xA;1. snowball\_inventory&#xA;    &#xA;2. snowball\_categories&#xA;    &#xA;&#xA;And we will RIGHT JOIN, that is, select everything from the right table into the result set even if the condition is not true; the unmatched side will be NULL. 
That case doesn&#39;t arise here, though, as there are only 6 non-NULL category names in the snowball\_categories table.&#xA;&#xA;```sql&#xA;SELECT&#xA;  snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;RIGHT JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;```&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT&#xA;  snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;RIGHT JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;```&#xA;&#xA;However, if we try to use `LEFT JOIN` with `snowball_inventory` as the `LEFT` table, we might get something weird.&#xA;&#xA;```sql&#xA;SELECT&#xA;  
snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;LEFT JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;```&#xA;&#xA;We get the 6 categories, plus 1 more with an empty name?&#xA;&#xA;That extra row collects all the rows that didn&#39;t meet the JOIN criteria: since we asked for a `LEFT JOIN`, the result still covers every row of the left table, which is the `snowball_inventory` table, and the unmatched ones end up under a `NULL` category.&#xA;&#xA;```plaintext&#xA;&#xA;sqlite&gt; SELECT&#xA;  snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;LEFT JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;|                           | 1692814                |&#xA;+---------------------------+------------------------+&#xA;```&#xA;&#xA;## Timer&#xA;&#xA;Phew! 
Let&#39;s see which one of them is the best one&#xA;&#xA;Let&#39;s see the timings:&#xA;&#xA;```plaintext&#xA;.timer on&#xA;```&#xA;&#xA;### SUBQUERY&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT category_name, SUM(quantity)AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;WHERE&#xA;   category_name IN (SELECT official_category FROM snowball_categories)&#xA;   AND quantity &gt; 0&#xA;   AND status == &#39;ready&#39;&#xA;GROUP BY category_name&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|       category_name       | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;Run Time: real 0.077 user 0.073886 sys 0.002942&#xA;&#xA;sqlite&gt; &#xA;```&#xA;&#xA;### LEFT JOIN&#xA;&#xA;```plaintext&#xA;sqlite&gt;  SELECT snowball_categories.official_category AS official_category, SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_categories&#xA;LEFT JOIN snowball_inventory&#xA;     ON snowball_categories.official_category = snowball_inventory.category_name &#xA;     AND snowball_inventory.quantity &gt; 0&#xA;     AND snowball_inventory.status == &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 
70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;Run Time: real 0.182 user 0.171433 sys 0.009907&#xA;sqlite&gt; &#xA;```&#xA;&#xA;### INNER JOIN&#xA;&#xA;```plaintext&#xA;sqlite&gt;  SELECT&#xA;  snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = &#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;Run Time: real 0.072 user 0.069623 sys 0.001929&#xA;sqlite&gt; &#xA;```&#xA;&#xA;### RIGHT JOIN&#xA;&#xA;```plaintext&#xA;sqlite&gt; SELECT                                                       &#xA;  snowball_categories.official_category AS official_category,&#xA;  SUM(snowball_inventory.quantity) AS total_usable_snowballs&#xA;FROM snowball_inventory&#xA;RIGHT JOIN snowball_categories&#xA;  ON snowball_categories.official_category = snowball_inventory.category_name&#xA;  AND snowball_inventory.quantity &gt; 0&#xA;  AND snowball_inventory.status = 
&#39;ready&#39;&#xA;GROUP BY official_category&#xA;ORDER BY total_usable_snowballs;&#xA;+---------------------------+------------------------+&#xA;|     official_category     | total_usable_snowballs |&#xA;+---------------------------+------------------------+&#xA;| arctic blast premium      | 11470                  |&#xA;| everfrost training round  | 24248                  |&#xA;| polar precision microball | 70773                  |&#xA;| glacier sphere (xl)       | 165158                 |&#xA;| north ridge compact       | 594119                 |&#xA;| frost-flight deluxe       | 952019                 |&#xA;+---------------------------+------------------------+&#xA;Run Time: real 0.151 user 0.149690 sys 0.001957&#xA;&#xA;sqlite&gt; &#xA;```&#xA;&#xA;Time-wise ranking:&#xA;&#xA;1. INNER JOIN&#xA;    &#xA;2. SUBQUERY&#xA;    &#xA;3. RIGHT JOIN&#xA;    &#xA;4. LEFT JOIN&#xA;    &#xA;&#xA;All of these are quite quick; there isn&#39;t much to measure here. I am sure there are other ways to do this, but I am stopping here for today. We explored JOINs a little.&#xA;&#xA;See you for day 3</content>
      <type>sqlog</type>
    </item>
    <item>
      <title>Advent of SQL 2025: Wish List</title>
      <link>https://www.meetgor.com/sqlog/advent-of-sql-2025-day-1</link>
      <description>Learning SQLite: Advent of SQL Day 1 I am trying to learn SQLite, I want to understand that database. It&#39;s quite simple yet the whole world uses it for various</description>
      <pubDate>Tue, 16 Dec 2025 00:00:00 UTC</pubDate>
      <content>## Learning SQLite: Advent of SQL Day 1&#xA;&#xA;I am trying to learn SQLite, I want to understand that database. It&#39;s quite simple yet the whole world uses it for various kinds of things ranging from developers&#39; toy database to spaceships. What a tiny engineering marvel!&#xA;&#xA;I am happy to see this happening: [Advent of SQL](https://databaseschool.com/series/advent-of-sql-videos-308)&#xA;&#xA;What a better time to learn more. I guess I want to start by exploring all the specificities of the INSERT statement in SQLite after exploring most of the things of the CREATE TABLE statement.&#xA;&#xA;But here I am jumping to this. Why? Because I want to solve something first before exploring other branches.&#xA;&#xA;Today I am going to try to solve the day 1 part.&#xA;&#xA;&gt; **NOTE:** I would be using my local sqlite db for this or a playground on this for testing. I am not going to use the browser-based playground attached in the databaseschool.com app for a reason. I want to use SQLite. The database is some form of Postgres; I don&#39;t mind using it, but I want to do it in SQLite.&#xA;&#xA;I have a playground on my blog for SQLite, you can try it out here:&#xA;&#xA;```sql&#xA;SELECT 1;&#xA;```&#xA;&#xA;It uses an embedded SQLite version (3.49.1) with sql-js as a wasm extension: [sql.js v1.13.0](https://github.com/sql-js/sql.js/releases/tag/v1.13.0)&#xA;&#xA;Back to the problem elves!&#xA;&#xA;## Setup&#xA;&#xA;This is the first day, so advent calendar usually requires some setup or preparation for the rest of the days. Luckily it&#39;s optional for you if you are doing it in the playground of database school or in PostgreSQL Database.&#xA;&#xA;We have some .sql files as input for creation and insertions of tables and rows in the database. 
It&#39;s for constructing the schema (tables) and populating the rows that the problem requires.&#xA;&#xA;The SQL looked something like:&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS wish_list CASCADE;&#xA;&#xA;CREATE TABLE wish_list (&#xA;   id          BIGSERIAL PRIMARY KEY,&#xA;   child_name  TEXT,&#xA;   raw_wish    TEXT&#xA;);&#xA;&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (1, &#39;James A.&#39;, &#39; BLUEY SUPERMARKET PLAY SET&#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (2, &#39;Sade C.&#39;, &#39;lego star wars set &#39;);&#xA;```&#xA;&#xA;There are around 499,000 rows!&#xA;&#xA;However, when I tried to read it directly in a SQLite shell as:&#xA;&#xA;```&#xA;.read day1-wish-list.sql&#xA;```&#xA;&#xA;I got a parse error on the DROP TABLE statement:&#xA;&#xA;```&#xA;sqlite&gt; .read day1-wish-list.sql&#xA;Parse error near line 11: near &#34;CASCADE&#34;: syntax error&#xA;  DROP TABLE IF EXISTS wish_list CASCADE;&#xA;                       error here ---^&#xA;```&#xA;&#xA;Obviously it was designed for Postgres. It won&#39;t work in SQLite.&#xA;&#xA;SQLite is minimal. It might not have everything that PostgreSQL has, but PostgreSQL might have everything that SQLite has (maybe, but not as is).&#xA;&#xA;So, we need to remove the CASCADE, which is an option that decides what happens to related data rows when a relation is removed. In this case, it cascades: it deletes all the related records in the related tables. SQLite doesn&#39;t have options to modify relations in the DROP TABLE statement; it has that for CREATE TABLE with the foreign key constraint.&#xA;&#xA;Now we need to remove it. 
It can&#39;t be in a DROP TABLE statement for a SQLite database:&#xA;&#xA;```sql&#xA;DROP TABLE IF EXISTS wish_list;&#xA;```&#xA;&#xA;Now, let&#39;s check by running the queries again:&#xA;&#xA;```&#xA;.read day1-wish-list.sql&#xA;```&#xA;&#xA;That works!&#xA;&#xA;```&#xA;$ sqlite3&#xA;SQLite version 3.45.1 2024-01-30 16:01:20&#xA;Enter &#34;.help&#34; for usage hints.&#xA;Connected to a transient in-memory database.&#xA;Use &#34;.open FILENAME&#34; to reopen on a persistent database.&#xA;sqlite&gt; .read day1-wish-list_sqlite1.sql&#xA;sqlite&gt; .schema&#xA;CREATE TABLE wish_list (&#xA;   id          BIGSERIAL PRIMARY KEY,&#xA;   child_name  TEXT,&#xA;   raw_wish    TEXT&#xA;);&#xA;```&#xA;&#xA;But this looks weird:&#xA;&#xA;```sql&#xA;CREATE TABLE wish_list (&#xA;   id          BIGSERIAL PRIMARY KEY,&#xA;   child_name  TEXT,&#xA;   raw_wish    TEXT&#xA;);&#xA;```&#xA;&#xA;BIGSERIAL is not a datatype in SQLite (it is in PostgreSQL). But does it matter? In SQLite, if the table is not STRICT, it doesn&#39;t matter what type a column is declared as, or even whether it has a declared type at all. 
That&#39;s fine.&#xA;&#xA;INSERT some rows shall we?&#xA;&#xA;```sql&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (1, &#39;James A.&#39;, &#39; BLUEY SUPERMARKET PLAY SET&#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (2, &#39;Sade C.&#39;, &#39;lego star wars set &#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (3, &#39;Juan Q.&#39;, &#39;   SCOOTER &#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (4, &#39;Samir S.&#39;, &#39;   LEGO STAR WARS SET  &#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (5, &#39;Priya E.&#39;, &#39;shaved ice machine   &#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (6, &#39;Henry L.&#39;, &#39;   mini brands fill the fridge&#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (7, &#39;Ayumi C.&#39;, &#39;VR HEADSET&#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (8, &#39;Juan Y.&#39;, &#39;BARBIE DREAMHOUSE   &#39;);&#xA;INSERT INTO wish_list (id, child_name, raw_wish) VALUES (9, &#39;Priya O.&#39;, &#39;  VR HEADSET  &#39;);&#xA;```&#xA;&#xA;```sql&#xA;SELECT * FROM wish_list;&#xA;```&#xA;&#xA;&#xA;But I want to know what is the type of `wish_list.id`?&#xA;&#xA;```sql&#xA;SELECT distinct(typeof(id)) FROM wish_list;&#xA;```&#xA;&#xA;Here&#39;s the output:&#xA;&#xA;```&#xA;sqlite&gt; SELECT distinct(typeof(id)) FROM wish_list;&#xA;integer&#xA;```&#xA;&#xA;It is integer, because of type affinity, I guess. Good work SQLite.&#xA;&#xA;Looks like the data is fine.&#xA;&#xA;Now to the problem.&#xA;&#xA;## Problem&#xA;&#xA;[Link to the challenge](https://databaseschool.com/series/advent-of-sql-videos-309-text-challenge-using-the-and-get-started)&#xA;&#xA;**Challenge:** Using the wish_list table, count how many times each cleaned toy name appears, from most requested to least requested. Return the results in two columns: wish and count. 
Make sure the wish results have no extra leading or trailing spaces and are all lowercase.&#xA;&#xA;So, simply, we need two columns:&#xA;- **wish** (the text)&#xA;- **count** (the number of times that wish is wished)&#xA;&#xA;### Subtleties&#xA;&#xA;&gt; Some children had typed extra spaces. Some wrote in ALL CAPS. Some had letters that danced between cases like playful snowflakes.&#xA;&gt; &#xA;&gt; I know there are some issues with spelling, the extra spaces, or the funny capitalization, but I just need to know what the children truly meant.&#xA;&#xA;So, we need to either lowercase or uppercase the wish text and trim off the spaces.&#xA;&#xA;SCALAR FUNCTIONS!!&#xA;&#xA;### Scalar Functions&#xA;&#xA;[SQLite Core Functions](https://sqlite.org/lang_corefunc.html)&#xA;&#xA;I read through the list of around ~70 of them; most of them are kind of the same with different parameters.&#xA;&#xA;The ones I found relevant are:&#xA;- [LOWER](https://sqlite.org/lang_corefunc.html#lower)&#xA;- [TRIM](https://sqlite.org/lang_corefunc.html#trim)&#xA;&#xA;That&#39;s it, right? Convert with LOWER (or UPPER) and TRIM off the spaces.&#xA;&#xA;```sql&#xA;SELECT LOWER(TRIM(raw_wish)) FROM wish_list;&#xA;```&#xA;&#xA;Don&#39;t run it on all the rows just yet!&#xA;&#xA;```sql&#xA;SELECT LOWER(TRIM(raw_wish)) FROM wish_list LIMIT 100;&#xA;```&#xA;&#xA;Looks good. Now to the next step.&#xA;&#xA;### Grouping and Counting&#xA;&#xA;We need to count them, i.e. 
to group by the wish.&#xA;&#xA;&gt; GROUP BY: what GROUP BY does is condense the rows that share a value in a certain column into a single row. For instance, if there are 10 entries for &#34;lego star wars set&#34;, adding a GROUP BY wish will create a single entry for that wish, and we can then perform operations like sum, count, and average over the rows of each group.&#xA;&#xA;```sql&#xA;SELECT LOWER(TRIM(raw_wish)) AS wish, count(*) AS count&#xA;FROM wish_list&#xA;GROUP BY wish;&#xA;```&#xA;&#xA;Here we are grouping by wish because we don&#39;t want 10 entries of &#34;lego star wars set&#34;; we just want one common entry to view the unique wishes.&#xA;&#xA;Also, by using `COUNT(*)` we are counting the rows in each group. As I said, the multiple rows with the same wish are squished into a single row, so now we can use aggregate functions like count and sum; in our case, we want to count how many instances of each particular wish there are.&#xA;&#xA;&#xA;### Ordering Results&#xA;&#xA;Does that solve it? Mostly; we just need the ORDER BY now.&#xA;&#xA;Because we also need to order the results:&#xA;&#xA;```sql&#xA;SELECT LOWER(TRIM(raw_wish)) AS wish, count(*) AS count&#xA;FROM wish_list&#xA;GROUP BY wish&#xA;ORDER BY count DESC;&#xA;```&#xA;&#xA;Perfect? Probably.&#xA;&#xA;&gt; What ORDER BY does is determine how the rows should be sorted, based on what column and how, i.e. 
the column name and either ASC(ending) or DESC(ending).&#xA;&#xA;&#xA;Here we have ordered by count so that the most wished (or least wished) toy sits at the top, depending on whether we sort in decreasing or increasing order.&#xA;&#xA;### Results&#xA;&#xA;Now with table mode on:&#xA;&#xA;```&#xA;sqlite&gt; .mode table&#xA;sqlite&gt; SELECT LOWER(TRIM(raw_wish)) as wish, count(*) as count FROM wish_list group by wish order by count desc;&#xA;+-----------------------------+-------+&#xA;|            wish             | count |&#xA;+-----------------------------+-------+&#xA;| lego city f1 car            | 32893 |&#xA;| barbie dreamhouse           | 32785 |&#xA;| nerf blaster                | 32746 |&#xA;| lego star wars set          | 32611 |&#xA;| beyblade battle arena       | 29564 |&#xA;| magna-tiles pet playhouse   | 29529 |&#xA;| bluey supermarket play set  | 26292 |&#xA;| lego friends amusement park | 25982 |&#xA;| pokemon trainer box         | 25968 |&#xA;| duplo building set          | 23005 |&#xA;| mini brands fill the fridge | 22965 |&#xA;| electric toy train set      | 22885 |&#xA;| toniebox audio player       | 19529 |&#xA;| scooter                     | 19496 |&#xA;| vr headset                  | 16468 |&#xA;| squishmallows               | 16304 |&#xA;| shaved ice machine          | 16263 |&#xA;| drone for kids              | 13151 |&#xA;| coding robot                | 13025 |&#xA;| headphones                  | 13006 |&#xA;| interactive robot dog       | 9770  |&#xA;| fidget spinner              | 3590  |&#xA;| yo-yo                       | 3565  |&#xA;| slime kit                   | 3553  |&#xA;| littlest pet shop playset   | 3543  |&#xA;| chatter telephone           | 3527  |&#xA;| fingerlings robot monkey    | 3511  |&#xA;| rubik&#39;s revolution          | 3474  |&#xA;+-----------------------------+-------+&#xA;sqlite&gt;&#xA;```&#xA;&#xA;---&#xA;&#xA;Day 1 done; moving on to day 2 by helping those pesky elves tomorrow. 
I am amazed at the silly problems humans create with those elves as the target. Just kidding, humans are elves :)&#xA;&#xA;Happy Coding :)&#xA;&#xA;Happy Squealing</content>
      <type>sqlog</type>
    </item>
  </channel>
</rss>