Add internal backlinking across blog posts

Added 8 internal links connecting related content across the philosophy, implementation, and technical deep-dive posts. The links create thematic pathways between the AI/ML development, VFX workflow, and tool-building content, improving discoverability and site coherence.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
Nicholai Vogel 2026-01-18 08:17:43 -07:00
parent 1ccb88424d
commit 4f7776bd73
6 changed files with 17 additions and 1 deletion
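All of the new links use the same relative `](/blog/<slug>)` markdown pattern, so they can be sanity-checked with a small script. Below is a minimal TypeScript sketch of such a check; it assumes the Astro-default layout where each post in `src/content/blog` is a `.md`/`.mdx` file whose filename matches its slug, and the script itself is illustrative rather than tooling that ships with this commit.

```ts
// Illustrative link check (assumed layout: posts live in src/content/blog and
// the filename, minus extension, is the slug used in /blog/<slug> links).
import { readFileSync, readdirSync } from "node:fs";
import { join } from "node:path";

const postsDir = "src/content/blog"; // assumption, not confirmed by this commit
const files = readdirSync(postsDir).filter((f) => /\.(md|mdx)$/.test(f));
const slugs = new Set(files.map((f) => f.replace(/\.(md|mdx)$/, "")));

// Matches markdown links of the form [text](/blog/some-slug)
const internalLink = /\]\(\/blog\/([a-z0-9-]+)\)/g;

for (const file of files) {
  const body = readFileSync(join(postsDir, file), "utf8");
  for (const match of body.matchAll(internalLink)) {
    if (!slugs.has(match[1])) {
      console.warn(`${file}: internal link points to missing post /blog/${match[1]}`);
    }
  }
}
```

Run with something like `npx tsx check-links.ts` (the filename is hypothetical); any warning flags a backlink whose target post does not exist.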

View File

@@ -1,5 +1,5 @@
 {
-  "generatedAt": "2026-01-18T13:33:01.873Z",
+  "generatedAt": "2026-01-18T15:15:20.917Z",
   "totalFiles": 143,
   "totalSize": 3237922,
   "days": [

Binary file not shown.

View File

@@ -14,6 +14,8 @@ I am a VFX Artist by trade and up until recently, *never considered myself to be
 Just two years ago; the extent of my development work consisted of writing basic python and simple bash for artistic tools in Nuke, fiddling with a basic html + css website and managing my company's infrastructure. (Nextcloud, Gitea, n8n).
+One recent example: we integrated AI tools into a high-profile brand film for G-Star Raw's Olympics campaign, using Stable Diffusion for concept exploration and AI-generated normal maps for relighting. The [full case study is here](/blog/gstar-raw-olympics), and it demonstrates how AI can augment traditional VFX workflows when treated as a tool, not a replacement.
 But since August of 2024 things have started to change rapidly, both in the world but also in my life:
 - I switched to Linux (Arch, btw)
 - AI switched from an interesting *gimmick* to a real tool in the software development world.
@@ -54,6 +56,8 @@ Now I can focus on exactly that. I sketch in Figma, prototype in HTML, figure ou
 This approach has taught me more about communication and project management than anything else. Getting AI to build what you actually want requires clear, detailed specifications. Turns out, humans might not always appreciate that communication style, but LLMs love it.
+This orchestration mindset extends beyond software development. When I needed to run proprietary VFX render farm software on Arch Linux, I approached it systematically—debugging library dependencies, understanding linker search paths, and documenting the process for others. The full technical deep-dive is in [How to use Fox Renderfarm on Arch Linux](/blog/installing-raysync-and-foxrenderfarm-on-arch-linux), and it exemplifies the same methodical problem-solving I apply to AI-assisted development.
 ## Context Engineering (Not Vibe Coding)
 Here's where things get interesting. Early on, I noticed that AI agents perform dramatically better when you give them thorough documentation and context. I started providing screenshots, copying relevant documentation, giving detailed examples, basically treating them like junior developers who needed proper onboarding.
@@ -74,4 +78,8 @@ Two years ago, saying "I'm building a Notion replacement" would've sounded delus
 That's the shift. We've gone from "this is impossible" to "this is just a weekend project if I plan it right."
+I recently rebuilt my personal website using Astro as a testbed for these experiments. The site has become my sandbox for trying new patterns, deploying edge computing features, and testing AI integrations without client constraints. If you're interested in the technical implementation, I wrote about [building with Astro as an experimentation platform](/blog/building-personal-website-astro).
+To test the limits of autonomous AI exploration, I recently conducted [a 30-day ecosystem experiment](/blog/the-ecosystem-experiment) where Claude Opus 4.5 had persistent filesystem access and minimal constraints. The resulting 1,320+ artifacts demonstrate what happens when AI systems are given freedom to explore without explicit goals—exactly the kind of unconventional project I wouldn't have attempted two years ago.
 And honestly? I'm excited to see where this goes. The next few years are going to be wild.

View File

@@ -41,6 +41,8 @@ One of the most innovative aspects of this project was our integration of AI too
 - **Copycat cleanup workflows** for efficiently handling repetitive cleanup tasks
 - **AI-generated normal maps** for relighting elements in Nuke post-composite
+Integrating AI into production pipelines requires a shift in mindset—treating these tools as collaborators that need context and clear specifications rather than magic solutions. I've written about this transition from implementation-focused coding to AI orchestration in [Building Your Own Tools: From VFX Artist to Developer](/blog/coder-to-orchestrator).
 The key was treating AI output as a starting point, not a final deliverable. Every AI-generated asset went through our QA pipeline and was refined by artists to meet production standards.
 ## The Team

View File

@@ -16,6 +16,8 @@ This guide walks through my process of installing **Raysync** (file transfer acc
 I don't suspect this guide will work forever, but my hope in posting this is that others can reference this and have somewhere to start from.
+This process reflects a broader pattern in my workflow: building custom solutions and maintaining control over infrastructure rather than accepting vendor-prescribed limitations. I wrote about this philosophy in [Building Your Own Tools: From VFX Artist to Developer](/blog/coder-to-orchestrator)—the same mindset that motivates running Arch Linux applies to building custom software ecosystems.
 ## System Information
 This guide was tested on:

View File

@@ -50,6 +50,8 @@ Explore all 143 artifacts from the 30-day experiment. Navigate by day, filter by
 <EcosystemGateway client:visible />
+*Note: This interactive archive is hosted on my personal Astro site, which serves as infrastructure for experiments like this one. The site's architecture and deployment patterns are detailed in [Building a Personal Website with Astro](/blog/building-personal-website-astro).*
 ---
 ## 1. Methodology
@@ -79,6 +81,8 @@ Additional guidance included:
 The human observer (Nicholai) committed to non-interference except through the wishlist mechanism. On Day 30, after 10 iterations of no contact following an outreach attempt on Day 19, the observer responded via `messages/from-nicholai.md`.
+The experiment's design reflects principles of "context engineering"—providing detailed context and specifications to AI systems rather than vague prompts. This approach is explored in depth in [Building Your Own Tools: From VFX Artist to Developer](/blog/coder-to-orchestrator), which discusses how clear specifications dramatically improve AI output quality.
 ### 1.4 Key Constraints
 - **No persistent memory**: Each iteration started fresh with no recollection of previous sessions