Legacy AI (Rule-based)

The image shows a diagram explaining “Legacy AI” or rule-based AI systems. The diagram is structured in three main sections:

  1. At the top: A workflow showing three steps:
    • “Analysis” (illustrated with a document and magnifying glass with charts)
    • “Prioritize” (shown as a numbered list with 1-2-3 and an upward arrow)
    • “Choose the best” (depicted with a network diagram and pointing hand)
  2. In the middle: Programming conditional statement structure:
    • “IF [ ]” section contains analysis and prioritization icons, representing the condition evaluation
    • “THEN [ ]” section includes “optimal choice” icons, representing the action to execute when the condition is true
    • “It’s Rule” label on the right indicates that this is the traditional, explicitly programmed rule approach
  3. At the bottom: A pipeline process labeled “It’s Algorithm (Rule-based AI)” showing:
    • A series of interconnected components with arrows
    • Each component contains small icons representing analysis and rules
    • The process ends with “Serialize without duplications”

This diagram effectively illustrates the structure and workflow of traditional rule-based AI systems, demonstrating how they operate like conventional programming with IF-THEN statements. The system first analyzes data, then prioritizes information based on predefined criteria, and finally makes decisions by selecting the optimal choice according to the programmed rules. This represents the foundation of early AI approaches before the advent of modern machine learning techniques, where explicit rules rather than learned patterns guided the decision-making process.
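The Analysis → Prioritize → Choose-the-best flow with IF-THEN rules can be sketched as a minimal rule engine. This is an illustrative assumption, not code from the diagram; the function names, scores, and the 0.5 threshold are all hypothetical:

```python
# Minimal sketch of a rule-based ("Legacy AI") decision system.
# All names, scores, and the threshold are illustrative assumptions.

def analyze(raw):
    """Analysis: turn raw readings into one score per candidate."""
    return {name: sum(vals) / len(vals) for name, vals in raw.items()}

def prioritize(scores):
    """Prioritize: rank candidates by score, highest first."""
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

def choose_best(scores, threshold=0.5):
    """Choose the best: an explicit IF-THEN rule over the ranked list."""
    for name, score in prioritize(scores):
        if score > threshold:   # IF the predefined condition is met...
            return name         # THEN select this candidate
    return None                 # no rule fired

raw = {"A": [0.2, 0.4], "B": [0.7, 0.9], "C": [0.5, 0.7]}
scores = analyze(raw)
print(choose_best(scores))  # prints B
```

Because every rule is written by hand, the system's behavior is fully predictable but cannot improve beyond what the programmer encoded, which is exactly the limitation that learned models later addressed.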

With Claude

AI Data Center : Power Req.

This diagram illustrates power requirements and power management for AI data centers:

Top Section – “More Power & Control”:

  • Diverse power sources: SMR (Small Modular Reactor), Renewable Energy (wind, solar), and ESS (Energy Storage System)
  • Power control system directing electricity from these various sources to the data center through “Power Control with Grid”
  • Integrated system for reliable and sustainable power supply

Bottom Section – “Optimization”:

  • Power distribution system through transformers and power supply units
  • Central control system for power routing
  • Load Balancing and Dynamic Power Management capabilities
  • Efficient power distribution to server racks based on GPU workload
  • “More Stable” indication emphasizing system reliability

This diagram highlights the importance of diversifying reliable power sources, efficient power control, and optimized power management according to GPU workload in AI data centers.
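One way to picture the “Load Balancing / Dynamic Power Management” idea is to split a fixed power budget across server racks in proportion to GPU workload. The sketch below is a hypothetical illustration; the rack names, budget, and cap values are assumptions, not from the diagram:

```python
# Sketch: distribute a fixed power budget across racks in proportion
# to GPU utilization. Rack names, budget, and caps are assumptions.

def allocate_power(budget_kw, rack_utilization, cap_kw):
    """Proportional allocation with a per-rack cap; returns kW per rack."""
    total = sum(rack_utilization.values())
    allocation = {}
    for rack, util in rack_utilization.items():
        share = budget_kw * util / total if total else 0.0
        allocation[rack] = min(share, cap_kw)  # respect the rack's rated limit
    return allocation

utilization = {"rack-1": 0.9, "rack-2": 0.6, "rack-3": 0.3}  # GPU load, 0..1
allocation = allocate_power(budget_kw=120, rack_utilization=utilization, cap_kw=60)
# shares come out to roughly 60, 40, and 20 kW
```

A real dynamic power manager would re-run an allocation like this continuously as workload telemetry changes, which is what lets the facility stay within its grid contract while keeping busy racks fully fed.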

With Claude

AI DC Changes

The evolution of AI data centers has progressed through the following stages:

  1. Legacy – The initial form of data centers, providing basic computing infrastructure.
  2. Hyperscale – Evolved into a centralized (Centric) structure with these characteristics:
    • Led by Big Tech companies (Google, Amazon, Microsoft, etc.)
    • Focused on AI model training (Learning) with massive computing power
    • Concentration of data and processing capabilities in central locations
  3. Distributed – The current evolutionary direction with these features:
    • Expansion of Edge/On-device computing
    • Shift from AI training to inference-focused operations
    • Moving from Big Tech centralization to enterprise and national data sovereignty
    • Enabling personalization for customized user services

This evolution represents a democratization of AI technology, emphasizing data sovereignty, privacy protection, and the delivery of optimized services tailored to individual users.

AI data centers have evolved from legacy systems to hyperscale centralized structures dominated by Big Tech companies focused on AI training. The current shift toward distributed architecture emphasizes edge/on-device computing, inference capabilities, data sovereignty for enterprises and nations, and enhanced personalization for end users.

With Claude

New Human Challenges

This image titled “New Human Challenges” illustrates the paradigm shift in information processing in the AI era and the new roles humans must assume.

The diagram is structured in three tiers:

  1. Human (top row): Shows the traditional human information processing flow. Humans sense information from the “World,” perform “Analysis” using the brain, and make final “Decisions” based on this analysis.
  2. By AI (middle row): In the modern technological environment, information from the world is “Digitized” into binary code, and this data is then processed through “AI/ML” systems.
  3. Human Challenges (bottom row): Highlights three key challenges humans face in the AI era:
    • “Is it accurate?” – Verifying the quality and integrity of data collection processes
    • “Is it enough?” – Ensuring the trained data is sufficient and balanced to reflect all perspectives
    • “Are you responsible?” – Reflecting on whether humans can take ultimate responsibility for decisions suggested by AI

This diagram effectively demonstrates how the information processing paradigm has shifted from human-centered to AI-assisted systems, transforming the human role from direct information processors to supervisors and accountability holders for AI systems. Humans now face new challenges focused on ensuring data quality, data sufficiency and balance, and taking responsibility for final decision-making.

With Claude

Massive increase in data

The Explosive Growth of Data – Step by Step

  1. Writing and Books
    Humans started recording information with letters and numbers in books.
    Data was physical and limited.
  2. Computers and Servers
    We began creating and storing digital data using computers and internal servers.
    Data became digitized.
  3. Personal Computers
    PCs allowed individuals to create, store, and use data on their own.
    Data became personalized and widespread.
  4. The Internet
    Data moved beyond local use and spread globally through the internet.
    Data became connected worldwide.
  5. Cloud Computing
    With cloud technology, data storage and processing expanded rapidly.
    Big data was born.
  6. AI, Deep Learning, and LLMs
    Data now powers AI. Deep learning and large language models analyze and generate human-like content.
    Everything is becoming data.
  7. Beyond – Total Bit Transformation
    In the future, we will digitize everything—from thoughts to actions—into bits.
    Data will define all reality.

As data has grown explosively, computing has evolved to process it—from early machines to AI and LLMs—and soon, everything will be digitized into data.

With ChatGPT

Data Center Challenges

This diagram illustrates “Data Center Challenges” by visually explaining the key challenges faced by data centers and their potential solutions.

The central red circle highlights the main challenges:

  • “No Error” – representing reliable operations
  • “Cost down” – representing economic efficiency
  • Between these two goals, there typically exists a “trade-off” relationship

The “Optimization” section on the right breaks down the cost structure:

  1. “Power Cost”:
    • “Working” – representing IT power that can be optimized through “Green Coding”
    • “Cooling” – can be significantly optimized with “Using water” (liquid cooling) technologies
  2. “Labor Cost”:
    • Personnel costs that can be reduced through automation

The middle “Digital Automation” section shows:

  • “by Data” decision-making approaches
  • “With AI” methodologies

At the bottom, the final outcome shows:

  • “win win” – upward arrows and an “Optimization” label indicating that both goals can be achieved simultaneously

This diagram demonstrates how digital automation leveraging data and AI can help data centers achieve the seemingly conflicting goals of reliable operations and cost reduction simultaneously.
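The cost structure above can be captured in a toy model: facility energy cost is IT power scaled by PUE (Power Usage Effectiveness, the ratio of total facility power to IT power, which liquid cooling lowers), plus labor. All coefficients below are illustrative assumptions, not figures from the diagram:

```python
# Toy data-center cost model showing how the diagram's levers
# (green coding, liquid cooling, automation) reduce total cost.
# Every number here is an illustrative assumption.

def total_cost(it_kw, pue, price_per_kwh, hours, labor_cost):
    """Total cost = facility energy (IT power * PUE) * price + labor."""
    energy_kwh = it_kw * pue * hours
    return energy_kwh * price_per_kwh + labor_cost

# Baseline: air-cooled facility, manual operations.
baseline = total_cost(it_kw=1000, pue=1.6, price_per_kwh=0.10,
                      hours=720, labor_cost=50_000)

# Optimized: green coding trims IT load, liquid cooling lowers PUE,
# automation cuts labor.
optimized = total_cost(it_kw=900, pue=1.2, price_per_kwh=0.10,
                       hours=720, labor_cost=35_000)
```

In this sketch the monthly cost drops from about 165,200 to about 112,760, illustrating the “win win”: none of the three levers sacrifices reliability, so cost falls without moving away from the “No Error” goal.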

With Claude