diff --git a/404.html b/404.html
index 379b859..e2596c5 100644
--- a/404.html
+++ b/404.html
@@ -4,4 +4,4 @@ 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/about/index.html b/about/index.html
index ba97cd4..ba1bede 100644
--- a/about/index.html
+++ b/about/index.html
@@ -4,4 +4,4 @@ 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/categories/index.html b/categories/index.html
index 12b54f7..38ea517 100644
--- a/categories/index.html
+++ b/categories/index.html
@@ -4,4 +4,4 @@ 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/index.html b/index.html
index 7364d78..63a452d 100644
--- a/index.html
+++ b/index.html
@@ -1,7 +1,7 @@
-Eric X. Liu's Personal Page
\ No newline at end of file
diff --git a/index.xml b/index.xml
index 0154d30..672ce94 100644
--- a/index.xml
+++ b/index.xml
@@ -1,4 +1,4 @@
-Eric X. Liu's Personal Page/Recent content on Eric X. Liu's Personal PageHugoenTue, 23 Sep 2025 06:20:36 +0000UniFi VLAN Migration to Zone-Based Architecture/posts/unifi-vlan-migration-to-zone-based-architecture/Mon, 22 Sep 2025 00:00:00 +0000/posts/unifi-vlan-migration-to-zone-based-architecture/<p>Embarking on a network migration to a properly segmented VLAN architecture is a rite of passage for any serious home lab or small business operator. The goal is clear: improve security and organization by separating traffic. However, the path from a flat network to a segmented one is often paved with subtle but critical configuration details that can lead to hours of frustrating troubleshooting.</p>
+Eric X. Liu's Personal Page/Recent content on Eric X. Liu's Personal PageHugoenThu, 02 Oct 2025 07:22:54 +0000UniFi VLAN Migration to Zone-Based Architecture/posts/unifi-vlan-migration-to-zone-based-architecture/Mon, 22 Sep 2025 00:00:00 +0000/posts/unifi-vlan-migration-to-zone-based-architecture/<p>Embarking on a network migration to a properly segmented VLAN architecture is a rite of passage for any serious home lab or small business operator. The goal is clear: improve security and organization by separating traffic. However, the path from a flat network to a segmented one is often paved with subtle but critical configuration details that can lead to hours of frustrating troubleshooting.</p>
<p>This article documents that journey. It details the pitfalls encountered, the core networking concepts that were essential to understand, and the best practices that ultimately led to a stable, secure, and logical network design built on a zone-based firewall model.</p>Quantization in LLMs/posts/quantization-in-llms/Tue, 19 Aug 2025 00:00:00 +0000/posts/quantization-in-llms/<p>The burgeoning scale of Large Language Models (LLMs) has necessitated a paradigm shift in their deployment, moving beyond full-precision floating-point arithmetic towards lower-precision representations. Quantization, the process of mapping a wide range of continuous values to a smaller, discrete set, has emerged as a critical technique to reduce model size, accelerate inference, and lower energy consumption. This article provides a technical overview of quantization theories, their application in modern LLMs, and highlights the ongoing innovations in this domain.</p>Breville Barista Pro Maintenance/posts/breville-barista-pro-maintenance/Sat, 16 Aug 2025 00:00:00 +0000/posts/breville-barista-pro-maintenance/<p>Proper maintenance is critical for the longevity and performance of a Breville Barista Pro espresso machine. Consistent cleaning not only ensures the machine functions correctly but also directly impacts the quality of the espresso produced. This guide provides a detailed, technical breakdown of the essential maintenance routines, from automated cycles to daily upkeep.</p>
<h4 id="understanding-the-two-primary-maintenance-cycles">
<strong>Understanding the Two Primary Maintenance Cycles</strong>
@@ -16,7 +16,7 @@
<p>The answer lies in creating a universal language—a bridge between the continuous, messy world of pixels and audio waves and the discrete, structured world of language tokens. One of the most elegant and powerful tools for building this bridge is <strong>Residual Vector Quantization (RVQ)</strong>.</p>Supabase Deep Dive: It's Not Magic, It's Just Postgres/posts/supabase-deep-dive/Sun, 03 Aug 2025 00:00:00 +0000/posts/supabase-deep-dive/<p>In the world of Backend-as-a-Service (BaaS), platforms are often treated as magic boxes. You push data in, you get data out, and you hope the magic inside scales. While this simplicity is powerful, it can obscure the underlying mechanics, leaving developers wondering what&rsquo;s really going on.</p>
<p>Supabase enters this space with a radically different philosophy: <strong>transparency</strong>. It provides the convenience of a BaaS, but it’s built on the world&rsquo;s most trusted relational database: PostgreSQL. The &ldquo;magic&rdquo; isn&rsquo;t a proprietary black box; it&rsquo;s a carefully assembled suite of open-source tools that enhance Postgres, not hide it.</p>A Deep Dive into PPO for Language Models/posts/ppo-for-language-models/Sat, 02 Aug 2025 00:00:00 +0000/posts/ppo-for-language-models/<p>Large Language Models (LLMs) have demonstrated astonishing capabilities, but out-of-the-box, they are simply powerful text predictors. They don&rsquo;t inherently understand what makes a response helpful, harmless, or aligned with human values. The technique that has proven most effective at bridging this gap is Reinforcement Learning from Human Feedback (RLHF), and at its heart lies a powerful algorithm: Proximal Policy Optimization (PPO).</p>
<p>You may have seen diagrams like the one below, which outlines the RLHF training process. It can look intimidating, with a web of interconnected models, losses, and data flows.
-<img src="/images/ppo-for-language-models/7713bd3ecf27442e939b9190fa08165d.png" alt="S3 File"></p>Mixture-of-Experts (MoE) Models Challenges & Solutions in Practice/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/Wed, 02 Jul 2025 00:00:00 +0000/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/<p>Mixture-of-Experts (MoEs) are neural network architectures that allow different parts of the model (called &ldquo;experts&rdquo;) to specialize in different types of inputs. A &ldquo;gating network&rdquo; or &ldquo;router&rdquo; learns to dispatch each input (or &ldquo;token&rdquo;) to a subset of these experts. While powerful for scaling models, MoEs introduce several practical challenges.</p>
+<img src="http://localhost:4998/attachments/image-3632d923eed983f171fba4341825273101f1fc94.png?client=default&amp;bucket=obsidian" alt="S3 File"></p>Mixture-of-Experts (MoE) Models Challenges & Solutions in Practice/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/Wed, 02 Jul 2025 00:00:00 +0000/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/<p>Mixture-of-Experts (MoEs) are neural network architectures that allow different parts of the model (called &ldquo;experts&rdquo;) to specialize in different types of inputs. A &ldquo;gating network&rdquo; or &ldquo;router&rdquo; learns to dispatch each input (or &ldquo;token&rdquo;) to a subset of these experts. While powerful for scaling models, MoEs introduce several practical challenges.</p>
<h3 id="1-challenge-non-differentiability-of-routing-functions">
1. Challenge: Non-Differentiability of Routing Functions
<a class="heading-link" href="#1-challenge-non-differentiability-of-routing-functions">
diff --git a/posts/breville-barista-pro-maintenance/index.html b/posts/breville-barista-pro-maintenance/index.html
index 30b4c09..0b77105 100644
--- a/posts/breville-barista-pro-maintenance/index.html
+++ b/posts/breville-barista-pro-maintenance/index.html
@@ -25,4 +25,4 @@ Understanding the Two Primary Maintenance Cycles Link to heading The Breville Ba 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/espresso-theory-application-a-guide-for-the-breville-barista-pro/index.html b/posts/espresso-theory-application-a-guide-for-the-breville-barista-pro/index.html
index 966f77f..406b53e 100644
--- a/posts/espresso-theory-application-a-guide-for-the-breville-barista-pro/index.html
+++ b/posts/espresso-theory-application-a-guide-for-the-breville-barista-pro/index.html
@@ -20,4 +20,4 @@ Our overarching philosophy is simple: isolate and change only one variable at a 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/how-rvq-teaches-llms-to-see-and-hear/index.html b/posts/how-rvq-teaches-llms-to-see-and-hear/index.html
index ad6abea..f4954cc 100644
--- a/posts/how-rvq-teaches-llms-to-see-and-hear/index.html
+++ b/posts/how-rvq-teaches-llms-to-see-and-hear/index.html
@@ -18,4 +18,4 @@ The answer lies in creating a universal language—a bridge between the continuo 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/index.html b/posts/index.html
index 5cbecff..cfc6b46 100644
--- a/posts/index.html
+++ b/posts/index.html
@@ -14,4 +14,4 @@ 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/index.xml b/posts/index.xml
index e1afd69..3cadf97 100644
--- a/posts/index.xml
+++ b/posts/index.xml
@@ -1,4 +1,4 @@
-Posts on Eric X. Liu's Personal Page/posts/Recent content in Posts on Eric X. Liu's Personal PageHugoenTue, 23 Sep 2025 06:20:36 +0000UniFi VLAN Migration to Zone-Based Architecture/posts/unifi-vlan-migration-to-zone-based-architecture/Mon, 22 Sep 2025 00:00:00 +0000/posts/unifi-vlan-migration-to-zone-based-architecture/<p>Embarking on a network migration to a properly segmented VLAN architecture is a rite of passage for any serious home lab or small business operator. The goal is clear: improve security and organization by separating traffic. However, the path from a flat network to a segmented one is often paved with subtle but critical configuration details that can lead to hours of frustrating troubleshooting.</p>
+Posts on Eric X. Liu's Personal Page/posts/Recent content in Posts on Eric X. Liu's Personal PageHugoenThu, 02 Oct 2025 07:22:54 +0000UniFi VLAN Migration to Zone-Based Architecture/posts/unifi-vlan-migration-to-zone-based-architecture/Mon, 22 Sep 2025 00:00:00 +0000/posts/unifi-vlan-migration-to-zone-based-architecture/<p>Embarking on a network migration to a properly segmented VLAN architecture is a rite of passage for any serious home lab or small business operator. The goal is clear: improve security and organization by separating traffic. However, the path from a flat network to a segmented one is often paved with subtle but critical configuration details that can lead to hours of frustrating troubleshooting.</p>
<p>This article documents that journey. It details the pitfalls encountered, the core networking concepts that were essential to understand, and the best practices that ultimately led to a stable, secure, and logical network design built on a zone-based firewall model.</p>Quantization in LLMs/posts/quantization-in-llms/Tue, 19 Aug 2025 00:00:00 +0000/posts/quantization-in-llms/<p>The burgeoning scale of Large Language Models (LLMs) has necessitated a paradigm shift in their deployment, moving beyond full-precision floating-point arithmetic towards lower-precision representations. Quantization, the process of mapping a wide range of continuous values to a smaller, discrete set, has emerged as a critical technique to reduce model size, accelerate inference, and lower energy consumption. This article provides a technical overview of quantization theories, their application in modern LLMs, and highlights the ongoing innovations in this domain.</p>Breville Barista Pro Maintenance/posts/breville-barista-pro-maintenance/Sat, 16 Aug 2025 00:00:00 +0000/posts/breville-barista-pro-maintenance/<p>Proper maintenance is critical for the longevity and performance of a Breville Barista Pro espresso machine. Consistent cleaning not only ensures the machine functions correctly but also directly impacts the quality of the espresso produced. This guide provides a detailed, technical breakdown of the essential maintenance routines, from automated cycles to daily upkeep.</p>
<h4 id="understanding-the-two-primary-maintenance-cycles">
<strong>Understanding the Two Primary Maintenance Cycles</strong>
@@ -16,7 +16,7 @@
<p>The answer lies in creating a universal language—a bridge between the continuous, messy world of pixels and audio waves and the discrete, structured world of language tokens. One of the most elegant and powerful tools for building this bridge is <strong>Residual Vector Quantization (RVQ)</strong>.</p>Supabase Deep Dive: It's Not Magic, It's Just Postgres/posts/supabase-deep-dive/Sun, 03 Aug 2025 00:00:00 +0000/posts/supabase-deep-dive/<p>In the world of Backend-as-a-Service (BaaS), platforms are often treated as magic boxes. You push data in, you get data out, and you hope the magic inside scales. While this simplicity is powerful, it can obscure the underlying mechanics, leaving developers wondering what&rsquo;s really going on.</p>
<p>Supabase enters this space with a radically different philosophy: <strong>transparency</strong>. It provides the convenience of a BaaS, but it’s built on the world&rsquo;s most trusted relational database: PostgreSQL. The &ldquo;magic&rdquo; isn&rsquo;t a proprietary black box; it&rsquo;s a carefully assembled suite of open-source tools that enhance Postgres, not hide it.</p>A Deep Dive into PPO for Language Models/posts/ppo-for-language-models/Sat, 02 Aug 2025 00:00:00 +0000/posts/ppo-for-language-models/<p>Large Language Models (LLMs) have demonstrated astonishing capabilities, but out-of-the-box, they are simply powerful text predictors. They don&rsquo;t inherently understand what makes a response helpful, harmless, or aligned with human values. The technique that has proven most effective at bridging this gap is Reinforcement Learning from Human Feedback (RLHF), and at its heart lies a powerful algorithm: Proximal Policy Optimization (PPO).</p>
<p>You may have seen diagrams like the one below, which outlines the RLHF training process. It can look intimidating, with a web of interconnected models, losses, and data flows.
-<img src="/images/ppo-for-language-models/7713bd3ecf27442e939b9190fa08165d.png" alt="S3 File"></p>Mixture-of-Experts (MoE) Models Challenges & Solutions in Practice/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/Wed, 02 Jul 2025 00:00:00 +0000/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/<p>Mixture-of-Experts (MoEs) are neural network architectures that allow different parts of the model (called &ldquo;experts&rdquo;) to specialize in different types of inputs. A &ldquo;gating network&rdquo; or &ldquo;router&rdquo; learns to dispatch each input (or &ldquo;token&rdquo;) to a subset of these experts. While powerful for scaling models, MoEs introduce several practical challenges.</p>
+<img src="http://localhost:4998/attachments/image-3632d923eed983f171fba4341825273101f1fc94.png?client=default&amp;bucket=obsidian" alt="S3 File"></p>Mixture-of-Experts (MoE) Models Challenges & Solutions in Practice/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/Wed, 02 Jul 2025 00:00:00 +0000/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/<p>Mixture-of-Experts (MoEs) are neural network architectures that allow different parts of the model (called &ldquo;experts&rdquo;) to specialize in different types of inputs. A &ldquo;gating network&rdquo; or &ldquo;router&rdquo; learns to dispatch each input (or &ldquo;token&rdquo;) to a subset of these experts. While powerful for scaling models, MoEs introduce several practical challenges.</p>
<h3 id="1-challenge-non-differentiability-of-routing-functions">
1. Challenge: Non-Differentiability of Routing Functions
<a class="heading-link" href="#1-challenge-non-differentiability-of-routing-functions">
diff --git a/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/index.html b/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/index.html
index b7b6d8c..82c4c92 100644
--- a/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/index.html
+++ b/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/index.html
@@ -44,4 +44,4 @@ The Top-K routing mechanism, as illustrated in the provided ima 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/page/2/index.html b/posts/page/2/index.html
index 8fddb88..696a9cc 100644
--- a/posts/page/2/index.html
+++ b/posts/page/2/index.html
@@ -6,4 +6,4 @@ 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/ppo-for-language-models/index.html b/posts/ppo-for-language-models/index.html
index f417c35..2bedbe5 100644
--- a/posts/ppo-for-language-models/index.html
+++ b/posts/ppo-for-language-models/index.html
@@ -2,13 +2,13 @@ You may have seen diagrams like the one below, which outlines the RLHF training process. It can look intimidating, with a web of interconnected models, losses, and data flows. ">
\ No newline at end of file
diff --git a/posts/quantization-in-llms/index.html b/posts/quantization-in-llms/index.html
index b030281..bd48f6d 100644
--- a/posts/quantization-in-llms/index.html
+++ b/posts/quantization-in-llms/index.html
@@ -7,4 +7,4 @@ 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/secure-boot-dkms-and-mok-on-proxmox-debian/index.html b/posts/secure-boot-dkms-and-mok-on-proxmox-debian/index.html
index 2b017f5..a9859bd 100644
--- a/posts/secure-boot-dkms-and-mok-on-proxmox-debian/index.html
+++ b/posts/secure-boot-dkms-and-mok-on-proxmox-debian/index.html
@@ -59,4 +59,4 @@ nvidia-smi failed to communicate with the NVIDIA driver modprobe nvidia → “K 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/supabase-deep-dive/index.html b/posts/supabase-deep-dive/index.html
index b9588d2..97db3bb 100644
--- a/posts/supabase-deep-dive/index.html
+++ b/posts/supabase-deep-dive/index.html
@@ -90,4 +90,4 @@ Supabase enters this space with a radically different philosophy: transparency. 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/t5-the-transformer-that-zigged-when-others-zagged-an-architectural-deep-dive/index.html b/posts/t5-the-transformer-that-zigged-when-others-zagged-an-architectural-deep-dive/index.html
index 1e5d30f..5421b9c 100644
--- a/posts/t5-the-transformer-that-zigged-when-others-zagged-an-architectural-deep-dive/index.html
+++ b/posts/t5-the-transformer-that-zigged-when-others-zagged-an-architectural-deep-dive/index.html
@@ -30,4 +30,4 @@ But to truly understand the field, we must look at the pivotal models that explo 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/posts/transformer-s-core-mechanics/index.html b/posts/transformer-s-core-mechanics/index.html
index ec13770..ee2e6e3 100644
--- a/posts/transformer-s-core-mechanics/index.html
+++ b/posts/transformer-s-core-mechanics/index.html
@@ -8,7 +8,7 @@ In deep learning, a “channel” can be thought of as a feature dimension.
While this term is common in Convolutional Neural Networks for images (e.g., Red, Green, Blue channels), in LLMs, the analogous concept is the model’s primary embedding dimension, commonly referred to as d_model.">
\ No newline at end of file
diff --git a/posts/unifi-vlan-migration-to-zone-based-architecture/index.html b/posts/unifi-vlan-migration-to-zone-based-architecture/index.html
index 6725afd..8cc19f6 100644
--- a/posts/unifi-vlan-migration-to-zone-based-architecture/index.html
+++ b/posts/unifi-vlan-migration-to-zone-based-architecture/index.html
@@ -1,7 +1,7 @@ UniFi VLAN Migration to Zone-Based Architecture · Eric X. Liu's Personal Page
\ No newline at end of file
diff --git a/posts/useful/index.html b/posts/useful/index.html
index 2c52ded..e780892 100644
--- a/posts/useful/index.html
+++ b/posts/useful/index.html
@@ -9,4 +9,4 @@ One-minute read
  • [2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file
diff --git a/sitemap.xml b/sitemap.xml
index 4566578..630341a 100644
--- a/sitemap.xml
+++ b/sitemap.xml
@@ -1 +1 @@
-/2025-09-23T06:20:36+00:00weekly0.5/posts/2025-09-23T06:20:36+00:00weekly0.5/posts/unifi-vlan-migration-to-zone-based-architecture/2025-09-23T06:14:45+00:00weekly0.5/posts/quantization-in-llms/2025-08-20T06:02:35+00:00weekly0.5/posts/breville-barista-pro-maintenance/2025-08-20T06:04:36+00:00weekly0.5/posts/secure-boot-dkms-and-mok-on-proxmox-debian/2025-08-14T06:50:22+00:00weekly0.5/posts/how-rvq-teaches-llms-to-see-and-hear/2025-08-08T17:36:52+00:00weekly0.5/posts/supabase-deep-dive/2025-08-04T03:59:37+00:00weekly0.5/posts/ppo-for-language-models/2025-09-23T06:20:36+00:00weekly0.5/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/2025-08-03T06:02:48+00:00weekly0.5/posts/t5-the-transformer-that-zigged-when-others-zagged-an-architectural-deep-dive/2025-08-03T03:41:10+00:00weekly0.5/posts/espresso-theory-application-a-guide-for-the-breville-barista-pro/2025-08-03T04:20:20+00:00weekly0.5/posts/transformer-s-core-mechanics/2025-09-23T06:20:36+00:00weekly0.5/posts/useful/2025-08-03T08:37:28-07:00weekly0.5/about/2020-06-16T23:30:17-07:00weekly0.5/categories/weekly0.5/tags/weekly0.5
\ No newline at end of file
+/2025-10-02T07:22:54+00:00weekly0.5/posts/2025-10-02T07:22:54+00:00weekly0.5/posts/unifi-vlan-migration-to-zone-based-architecture/2025-10-02T07:22:54+00:00weekly0.5/posts/quantization-in-llms/2025-08-20T06:02:35+00:00weekly0.5/posts/breville-barista-pro-maintenance/2025-08-20T06:04:36+00:00weekly0.5/posts/secure-boot-dkms-and-mok-on-proxmox-debian/2025-08-14T06:50:22+00:00weekly0.5/posts/how-rvq-teaches-llms-to-see-and-hear/2025-08-08T17:36:52+00:00weekly0.5/posts/supabase-deep-dive/2025-08-04T03:59:37+00:00weekly0.5/posts/ppo-for-language-models/2025-10-02T07:22:54+00:00weekly0.5/posts/mixture-of-experts-moe-models-challenges-solutions-in-practice/2025-08-03T06:02:48+00:00weekly0.5/posts/t5-the-transformer-that-zigged-when-others-zagged-an-architectural-deep-dive/2025-08-03T03:41:10+00:00weekly0.5/posts/espresso-theory-application-a-guide-for-the-breville-barista-pro/2025-08-03T04:20:20+00:00weekly0.5/posts/transformer-s-core-mechanics/2025-10-02T07:22:54+00:00weekly0.5/posts/useful/2025-08-03T08:37:28-07:00weekly0.5/about/2020-06-16T23:30:17-07:00weekly0.5/categories/weekly0.5/tags/weekly0.5
\ No newline at end of file
diff --git a/tags/index.html b/tags/index.html
index ea6d546..0692448 100644
--- a/tags/index.html
+++ b/tags/index.html
@@ -4,4 +4,4 @@ 2016 - 2025 Eric X. Liu
-[2b2203c]
\ No newline at end of file
+[cc368da]
\ No newline at end of file