Continuing the Implementation Journey from Part 1 of AI Collaborative Development
In the first part of this series, Beyond Prompts: A Practical Record of Building a Sophisticated Blog with AI in 10 Days, I shared the strategic approach I took when building this blog in collaboration with AI. I covered everything from setting a vision to information architecture and design implementation.
While the first part focused on the "why" and "what," this second part concentrates on the "how." I'll detail the process of trial and error as I worked with AI to solve technically complex implementation challenges.
This article tackles the boundary design between server and client in the Next.js v15 App Router environment. This was the most challenging aspect—and where I gained the most insights—when implementing MDX extensions (diagrams, equations, syntax highlighting).
Simple questions like "How do you implement MDX in Next.js?" can be adequately answered by ChatGPT. However, the value of this article lies in testing multiple implementation options, experiencing failures, and ultimately arriving at the optimal solution with clear decision criteria.
Technology Selection and Approach: Starting with Fundamentals
Choosing a Framework as the Starting Line
In web development, selecting a framework forms the foundation of everything. In 2025, Next.js stands as one of the most prominent choices for React-based web application development. While this choice might seem obvious, it represents a critical decision point.
I chose Next.js for building this blog system for the following reasons:
- Compatibility between blog and future web applications: Not just for blogging, but preserving the potential to evolve into a sophisticated web application with advanced features
- Support for both static and dynamic processing: Offering both SSG (Static Site Generation) and SSR (Server Side Rendering), making it an ideal choice for content-focused platforms
- Benefits in AI collaboration: Next.js has extensive documentation and community support, resulting in higher quality information when querying AI—this provides value beyond merely being "popular"
- Affinity with MDX: Particularly excellent integration with MDX, which combines Markdown and JSX
The App Router, introduced in Next.js 13 and now the default in Next.js 15, brought the new paradigm of Server Components, which proves especially significant for advanced MDX implementation.
Note for Next.js v15: In Next.js v15, dynamic route parameters were changed to Promise-based access. Specifically, direct access like `params.slug` no longer works; `params` must first be resolved with `await params`. This change significantly impacts route handler implementation and requires special attention during migration.
I encountered this issue multiple times, and due to AI learning cutoff dates, automated solutions were often not suggested. The actual error message looks like this:
```
Route used 'params.slug'. 'params' should be awaited before using its properties.
```
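As a sketch of the corresponding fix, a dynamic route page can await `params` before destructuring. The file path and prop shape below are illustrative assumptions, not taken from this blog's actual code, and the string return value stands in for real JSX to keep the sketch self-contained:

```typescript
// Hypothetical app/blog/[slug]/page.tsx sketch for Next.js 15:
// `params` arrives as a Promise and must be awaited before property access.
type PageProps = { params: Promise<{ slug: string }> };

export default async function BlogPostPage({ params }: PageProps) {
  const { slug } = await params; // resolve first; bare `params.slug` triggers the warning above
  // A real page would return JSX here; a string keeps the sketch runnable anywhere.
  return `Rendering post: ${slug}`;
}
```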
The Truth About AI Collaborative Development: "Visual Grepping" and the Importance of Technical Skills
This project was built through a collaborative development process with AI. However, the reality differs significantly from the simplistic "AI code generation" often discussed in general discourse. I'd like to introduce an important concept here.
Contrasting "Vibe Coding" and "Visual Grepping"
"Vibe Coding" refers to an approach that relies heavily on AI to generate code. While superficially attractive, it often proves unsustainable in the long term. In contrast, what I practice, informally called "Visual Grepping," involves developers constantly monitoring AI-generated code and immediately fixing issues when they arise.
In practicing Visual Grepping, I maintain the fundamental premise that I could implement everything myself given enough time. I'm relatively fast compared to other developers, but the speed gap between me and AI is still vast. What's crucial, however, is the ability to technically evaluate AI output: I "visually scan" the code to identify and fix problems. This is the essence of "Visual Grepping."
Correlation Between Technical Skill and AI Output Quality
A critical fact has become evident: the creator's technical level directly impacts the quality of AI outputs. Specifically:
- Forming appropriate instructions: Technical understanding enables precise instructions to AI
- Output evaluation ability: The capacity to identify issues and optimization opportunities in generated code
- Providing context: Effectively communicating the position within the overall project
Even this blog's construction wasn't a simple "leave it to AI" approach, but rather an accumulative process. Each feature was carefully implemented, solving problems step by step. It's not as simple as getting everything right with a single prompt.
The "Slow and Steady Wins the Race" Development Philosophy
While "Vibe Coding" might appear attractive for short-term efficiency, investing in your technical skills with a "slow and steady wins the race" mindset is better for long-term growth and quality.
AI is merely a tool, and effectively wielding it requires the prerequisite ability to implement solutions yourself. While you can certainly learn more efficiently than my generation by consulting AI, don't succumb to the sweet temptation of "Vibe Coding."
Though it may be painful in the short term, improving your understanding and implementation skills independently is the optimal growth strategy in the long run. This is my conviction, reinforced through this blog construction project.
The Gap Between Ideal and Reality in MDX Implementation
The Ideal Approach in App Router
In Next.js App Router, Server Components have become the default. Theoretically, this enables an ideal approach where as much as possible is processed on the server, with only necessary parts handled by the client.
The advantages of this approach are clear:
- Improved initial load performance: Faster page loading through JavaScript reduction
- SEO optimization: Complete HTML generation on the server
- Effective use of server resources: Shifting computational load to the server
Particularly in MDX (Markdown with JSX) implementation, the ideal division of labor would be to perform markdown parsing, syntax highlighting, and diagram definition analysis on the server, while delegating only interactive elements to the client.
Implementation Choices in Reality
However, a significant gap exists between ideal and reality. In actual implementation, I faced the following technical constraints:
- React 19 compatibility issues: Hydration errors between Server Components and Client Components
- Library constraints: Many MDX extension libraries presuppose client-side execution
- Development experience complexity: Difficulty in properly designing server/client boundaries
Particularly problematic were hydration errors like this:
```
Error: Hydration failed because the initial UI does not match what was rendered on the server.
```
To address these challenges, compromises from the ideal approach became necessary. However, these compromises weren't mere "giving up," but rather seeking optimal solutions within realistic constraints.
Syntax Highlighting Implementation: The Shiki Case
Ideal Scenario and Design
For syntax highlighting—a core feature of technical blogs—Shiki was considered the optimal choice. Using the same highlighting engine as VSCode, this library provides high-quality syntax highlighting.
In the ideal implementation, the following approach was envisioned:
- Static generation at build time: Highlighting code with Shiki during MDX build processing
- Processing in Server Components: Server-side highlighting processing even for dynamic content
- Pure HTML/CSS output: Implementation without client-side JavaScript
In `next.config.mjs`, the rehype-pretty-code plugin was configured, expecting server-side processing:
```js
// next.config.mjs
import rehypePrettyCode from 'rehype-pretty-code';

const prettyCodeOptions = {
  theme: 'github-dark',
  keepBackground: true,
};

// Added to the rehypePlugins array
rehypePlugins: [
  rehypeKatex,
  rehypeSlug,
  [rehypePrettyCode, prettyCodeOptions],
],
```
Actual Implementation and Rationale
However, in practice, `components/mdx/CodeBlock.tsx` needed to be implemented as a Client Component with the `'use client';` directive for several reasons:
- Avoiding hydration errors: Preventing output discrepancies between server and client
- Implementing interactive features: Adding functionality like code copy buttons
- Dynamic style application: Supporting features like theme switching
As a result, syntax highlighting processing became a hybrid approach combining build-time rehype plugins and Client Components.
This implementation maintained high-quality syntax highlighting while also enabling interactive elements like code copying functionality. However, it departed from the ideal of complete server-side implementation.
Lessons Learned
The important lessons learned from syntax highlighting implementation are:
- Balance between ideal and reality: Complete Server Component implementation isn't always the optimal solution, even in App Router environments
- Gradual optimization: Prioritizing stable implementation of basic functionality before performance optimization
- Clear responsibility division: The importance of clearly separating the roles of build process, server processing, and client processing
Mermaid.js Diagram Implementation: Complex Client-Side Processing
Conceptualized Approach and Theory
When implementing Mermaid.js diagrams, the ideal division of responsibilities was conceptualized as follows:
- Server Component: Parsing and validating diagram definitions (static work)
- Client Component: SVG rendering and interactive features (dynamic work)
Based on this concept, the following implementation pattern was considered:
```tsx
// Server Component (MermaidChart.tsx)
import dynamic from "next/dynamic";

// Dynamically import the Client Component, skipping SSR
const ClientMermaidRenderer = dynamic(() => import("./ClientMermaidRenderer"), {
  ssr: false,
});

export function MermaidChart({ children }: { children: string }) {
  const chartDefinition = children.trim();
  return (
    <div className="mermaid-container">
      <ClientMermaidRenderer chartDefinition={chartDefinition} />
    </div>
  );
}
```

```tsx
// Client Component (ClientMermaidRenderer.tsx)
"use client";

import { useEffect, useRef } from "react";
import mermaid from "mermaid";
import { mermaidConfig } from "@/lib/mermaidConfig";

export default function ClientMermaidRenderer({
  chartDefinition,
}: {
  chartDefinition: string;
}) {
  const containerRef = useRef<HTMLDivElement>(null);

  useEffect(() => {
    // Rendering logic (sketch, assuming the Promise-based mermaid v10 API):
    // initialize with the shared config, render the definition to SVG,
    // then inject the markup into the container.
    mermaid.initialize(mermaidConfig);
    const id = `mermaid-${Math.random().toString(36).slice(2)}`;
    mermaid
      .render(id, chartDefinition)
      .then(({ svg }) => {
        if (containerRef.current) containerRef.current.innerHTML = svg;
      })
      .catch(console.error);
  }, [chartDefinition]);

  return <div ref={containerRef} />;
}
```
Actual Implementation and Challenges Faced
In reality, however, `MermaidChart.tsx` itself was implemented as a Client Component with the `'use client';` directive. The main reasons for this decision were:
- "mermaid is not defined" error: Inconsistencies with server-side rendering
- Hydration issues: Errors due to output differences between server and client
- Chart type detection logic: Need for flexible processing on the client side
Particularly challenging was chart type detection. Mermaid.js supports various chart types (flowcharts, sequence diagrams, ER diagrams, etc.), each requiring optimal display settings. Detecting and optimizing these on the server side proved technically complex.
Ultimately, I adopted an implementation that consistently processed everything from diagram definition analysis to rendering on the client side. This avoided synchronization issues between server and client, enabling stable diagram display.
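For illustration, the chart-type detection that proved so awkward to do on the server reduces, on the client, to inspecting the first meaningful line of the definition. The helper below is a simplified sketch with a deliberately incomplete keyword list; it mirrors the idea, not the blog's actual implementation:

```typescript
// Illustrative sketch: infer the Mermaid chart type from the definition's
// first non-empty, non-comment line. The keyword map is not exhaustive.
export function detectChartType(definition: string): string {
  const firstLine =
    definition
      .split("\n")
      .map((line) => line.trim())
      .find((line) => line.length > 0 && !line.startsWith("%%")) ?? "";
  const keyword = firstLine.split(/\s+/)[0];
  if (keyword === "flowchart" || keyword === "graph") return "flowchart";
  if (keyword === "sequenceDiagram") return "sequence";
  if (keyword === "erDiagram") return "er";
  if (keyword === "classDiagram") return "class";
  return "unknown";
}
```

Keeping this logic client-side means the detection result and the rendered SVG always come from the same code path, which is precisely what eliminated the server/client mismatch.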
Ensuring Visual Consistency
Even while transitioning to client-side implementation, ensuring visual consistency remained an important challenge. To solve this, detailed settings were made in `mermaidConfig.ts`:
```ts
// lib/mermaidConfig.ts (excerpt)
import type { MermaidConfig } from "mermaid";

export const mermaidConfig: MermaidConfig = {
  theme: "base",
  themeVariables: {
    // Common settings
    primaryColor: "#3d8fa9",     // Emphasis element background
    primaryTextColor: "#ffffff", // Emphasis element text
    // Settings for each chart type
    // ...
  },
};
```
These settings achieved consistent design across various chart types. I focused particularly on:
- Integration of Wadan brand colors: Adjusting color settings so diagrams harmonize with blog design
- Optimization for each chart type: Visual settings tailored to the characteristics of each chart type
- Mobile responsiveness: Ensuring visibility on small screens with responsive design
KaTeX Math Display: Styling Optimization
Expected Implementation Policy
For equation display, the ideal division of labor was also expected to be:
- Server side: Equation parsing and validation
- Client side: Rendering and display adjustment
Implementation Techniques and Problem Solving
In the actual implementation, I adopted an approach centered on styling adjustments of `span` elements in `mdx-components.tsx`:
```tsx
// mdx-components.tsx (excerpt)
"span": ({ className, children, ...props }: MDXComponentProps) => {
  // Special styling for KaTeX equation blocks
  if (className?.includes('katex-display')) {
    return (
      <div
        className="math-container"
        style={{
          width: '100%',
          overflowX: 'auto',
          overflowY: 'hidden',
          padding: '0.5rem 0',
          textAlign: 'center',
          margin: '1.5rem 0',
        }}
      >
        <span className={className} style={{ display: 'inline-block', fontSize: '1em' }} {...props}>
          {children}
        </span>
      </div>
    );
  }
  // For inline equations
  else if (className?.includes('katex')) {
    return <span className={className} style={{ fontSize: '1em' }} {...props}>{children}</span>;
  }
  // Normal span elements
  return <span className={className || ''} {...props}>{children}</span>;
},
```
This implementation addressed the following challenges:
- Scroll support: Implementing horizontal scrolling to prevent long equations from overflowing
- Font size adjustment: Optimizing the visual balance between equations and normal text
- Mobile support: Ensuring readability on small screens
These techniques improved user experience while maintaining complex equation display capabilities.
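The branching in the excerpt above can be viewed as a small pure classification. The helper below is an illustrative sketch with hypothetical names, not code from the blog's codebase; it simply mirrors the `className` checks, including the ordering subtlety:

```typescript
// Illustrative sketch mirroring the mdx-components span logic:
// classify a span by its KaTeX-generated className.
type SpanKind = "display-math" | "inline-math" | "plain";

export function classifySpan(className?: string): SpanKind {
  // Order matters: "katex-display" also contains the substring "katex",
  // so the display-math check must come first.
  if (className?.includes("katex-display")) return "display-math";
  if (className?.includes("katex")) return "inline-math";
  return "plain";
}
```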
Systematizing Implementation Decisions: Boundary Design Guidelines
Decision Framework for Boundary Design
From the experience with MDX extension implementation, I was able to construct a decision framework for server/client boundary design. These guidelines are applicable to other Next.js App Router development projects as well.
Based on this decision framework, I derived the following principles:
- Design Principle 1: Clear distinction between static and dynamic elements
- Design Principle 2: Consideration of balance between development efficiency and performance
- Design Principle 3: Gradual optimization approach (first ensure operation, then optimize)
Implementation Patterns in App Router Environment
In the Next.js App Router environment, the following implementation patterns proved effective:
- Full Server Pattern: Features that can be completed entirely on the server, such as static content and data fetching
- Client Pattern: Features centered on user interaction
- Hybrid Pattern: Data preparation on the server, display and interaction on the client
Appropriately selecting these patterns according to the situation leads to both efficient development and high performance.
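As a toy formalization of these patterns (the function and its two-question decision table are my own simplification, not a rule from Next.js), the choice can be read as a pure function of whether a feature is static and whether it needs interaction:

```typescript
// Toy sketch: reduce the three patterns to two questions.
// Names are illustrative only.
type Pattern = "full-server" | "client" | "hybrid";

export function choosePattern(opts: {
  isStatic: boolean;      // can the output be computed ahead of time?
  isInteractive: boolean; // does the user interact with it after load?
}): Pattern {
  if (!opts.isInteractive) return "full-server"; // static content, data fetching
  if (!opts.isStatic) return "client";           // purely interactive widgets
  return "hybrid";                               // prepare on server, interact on client
}
```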
Implementation Decisions in AI Collaborative Development
In AI collaborative development, the following points proved important:
- Parallel consideration of multiple approaches: Comparing multiple implementation methods proposed by AI
- Thorough technical evaluation: Constantly verifying the technical validity of AI-generated code
- Utilizing Memory Bank: Systematically accumulating and utilizing past judgments and insights
This enables optimal development combining AI's generative capabilities with human technical judgment.
Future Development and Optimization Strategies
Potential Expansion of Server-Side Processing
While the current implementation is the optimal solution within technical constraints, future improvements are expected in the following directions:
- Evolution of Next.js/React: Strengthening cooperation between Server Components and Client Components
- Partial hydration: Development of technology to make only necessary parts interactive
- Build-time optimization: Development of methods to complete more processing at build time
In particular, with the stabilization of React 19 and maturation of the Next.js App Router ecosystem, there's potential to migrate some currently client-side processing to the server side.
User Experience and Performance Optimization
From the perspective of ultimate user experience, the following optimizations are important:
- Speeding up initial display: Optimizing the critical rendering path
- Improving interactivity: Appropriate client processing where needed
- Ensuring accessibility: Usability across various devices and conditions
An approach of continuously improving implementation while balancing these factors is effective.
Conclusion: The True Value Brought by the Fusion of Technical Skills and AI Collaboration
Through the implementation of MDX extensions, I gained technical lessons and deep insights into AI collaborative development. These have universal value in modern software development, beyond mere blog construction.
Core Lessons in Technical Implementation
- Balance between ideal and reality: The importance of pursuing technical ideals while recognizing practical constraints
- Gradual implementation approach: The effectiveness of ensuring stable operation of basic functionality before proceeding with optimization
- Clarification of responsibility division: The value of clearly defining responsibility boundaries between server and client
In the Next.js App Router environment, it's important to use Server Components "appropriately" rather than "as much as possible." Making that judgment requires technical understanding and experience.
The Essence of AI Collaborative Development
In the AI era of development, the most valuable aspect is not "dependence on AI" but "technical independence."
AI is a tool, and the ability of engineers who wield that tool is decisively important.
While I utilized AI in building this blog, I always maintained the perspective of "how would I implement this?" Sometimes completely rewriting proposed code, sometimes partially adopting it, I always exercised technical judgment.
The "Slow and Steady Wins the Race" Development Philosophy
I want to reiterate this important point:
While it may seem like a detour in the short term, improving your own technical skills is the most efficient path in the long run. AI tools are powerful, but only developers with a technical foundation capable of independent implementation can effectively utilize them.
Not succumbing to the temptation of "Vibe Coding" and steadily honing your skills—this remains an unwavering truth even in the AI era. Ultimately, true innovation emerges when AI and human capabilities multiply together.
Through this blog construction journey, I continue to explore the future of development in collaboration with AI. It is not a path of dependence on AI, but one of intelligently utilizing AI while continuously honing one's technical skills and judgment.
"Slow and steady wins the race." This old proverb demonstrates its true value precisely in the cutting-edge AI era.