TCP/IP Better

This image is an informational diagram titled “TCP/IP and better” that explains various aspects of network protocols and optimizations.

The diagram is organized into three main sections:

  1. Connection
    • Shows “3 way Handshaking” with a visual representation of the SYN, SYN+ACK, ACK sequence
    • “Optimizing Handshake Latency” section mentions:
      • QUIC (Developed by Google, used in HTTP/3) → Supports 0-RTT handshake
      • TCP Fast Open (TFO) → Allows data to be sent with the initial SYN by reusing a cookie from a previous connection (a socket-level sketch follows this list)
  2. Congestion Control
    • Lists the “Tahoe & Reno” congestion control algorithms
    • Shows send-buffer-size (congestion window) diagrams annotated with loss events such as “Timeout”, “3-Dup-Ack”, and “3-Dup-Ack (Reno)”
    • “Minimizing Network Congestion & Fast Recovery” section mentions:
      • CUBIC → Less sensitive to RTT, enabling faster congestion recovery
      • BBR (Bottleneck Bandwidth and RTT) → Dynamically adjusts transmission rate based on real-time network conditions
  3. Header Removal
    • Shows a TCP header structure diagram and an “Optimize header” section
    • “Reducing Overhead” section mentions:
      • Compresses TCP/IP headers on low-bandwidth links (PPP, satellite), e.g., via Van Jacobson header compression
      • Uses UDP instead of TCP, eliminating the need for a TCP header
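
The following is a minimal sketch (Linux-only, Python standard library) of how an application could opt in to two of the optimizations above: TCP Fast Open on a listening socket and an explicit congestion control algorithm on a client socket. Whether these socket options are available depends on the kernel version and configuration.

```python
import socket

# Server side: let the kernel accept data carried in the SYN (TCP Fast Open).
# socket.TCP_FASTOPEN is Linux-only; the value is the TFO backlog length.
srv = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
srv.setsockopt(socket.IPPROTO_TCP, socket.TCP_FASTOPEN, 16)
srv.bind(("0.0.0.0", 8080))
srv.listen()

# Client side: request a specific congestion control algorithm before
# connecting. The name must appear in
# /proc/sys/net/ipv4/tcp_available_congestion_control (e.g. "cubic", "bbr").
cli = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
cli.setsockopt(socket.IPPROTO_TCP, socket.TCP_CONGESTION, b"bbr")
```

QUIC itself is implemented in user space on top of UDP, so its 0-RTT behavior comes from a QUIC library or HTTP/3 stack rather than from socket options like these.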

The diagram appears to be an educational resource about TCP/IP protocols and various optimizations that have been developed to improve network performance, particularly focused on connection establishment, congestion control, and overhead reduction.

With Claude

Data Quality

The image shows a data quality infographic with key dimensions that affect AI systems.

At the top of the image, there’s a header titled “Data Quality”. Below that, there are five key data quality dimensions illustrated with icons:

  • Accuracy – represented by a target with a checkmark. This is essential for AI models to produce correct results, as data with fewer errors and biases enables more accurate predictions.
  • Consistency – shown with circular arrows forming a cycle. This maintains consistent data formats and meanings across different sources and over time, enabling stable learning and inference in AI models.
  • Timeliness – depicted by a clock/pie chart with checkmarks. Providing up-to-date data in a timely manner allows AI to make decisions that accurately reflect current circumstances.
  • Resolution – illustrated with “HD” text and people icons underneath. This refers to the level of detail in the data, achieved through higher data density, e.g., more frequent sampling per unit of time. High-resolution data allows AI to detect subtle patterns and changes, enabling more sophisticated analysis and prediction.
  • Quantity – represented by packages/boxes with a hand underneath. AI systems, particularly deep learning models, perform better when trained on large volumes of data. Sufficient data quantity allows for learning diverse patterns, preventing overfitting, and enabling recognition of rare cases or exceptions. It also improves the model’s generalization capability, ensuring reliable performance in real-world environments.
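
As a rough illustration, the sketch below computes crude proxies for each of the five dimensions on a tabular dataset using pandas. The file name and the “label”/“timestamp” columns are hypothetical, and real pipelines would use far more rigorous checks.

```python
import pandas as pd

# Hypothetical tabular dataset with "label" and "timestamp" columns.
df = pd.read_csv("dataset.csv")

# Accuracy (proxy): share of missing values across all cells.
missing_ratio = df.isna().mean().mean()

# Consistency: exact duplicate rows hint at format or ingestion problems.
duplicate_rows = int(df.duplicated().sum())

# Timeliness: age of the newest record (assuming naive local timestamps).
timestamps = pd.to_datetime(df["timestamp"])
data_age = pd.Timestamp.now() - timestamps.max()

# Resolution: median sampling interval between consecutive records.
sampling_interval = timestamps.sort_values().diff().median()

# Quantity: simply the number of rows available for training.
n_rows = len(df)

print(missing_ratio, duplicate_rows, data_age, sampling_interval, n_rows)
```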

The bottom section features a light gray background with a conceptual illustration showing how these data quality dimensions contribute to AI. On the left side is a network of connected databases, devices, and information systems. An arrow points from this to a neural network representation on the right side, with the text “Data make AI” underneath.

The image appears to be explaining that these five quality dimensions are essential for creating effective AI systems, emphasizing that the quality of data directly impacts AI performance.

With Claude

LLM/RAG/Agentic

This image shows a diagram titled “LLM RAG Agentic” that illustrates the components and relationships in an AI system architecture.

The diagram is organized in a grid-like layout with three rows and three columns. Each row appears to represent different functional aspects of the system:

Top row:

  • Left: “Text QnA” in a blue box
  • Middle: A question mark icon with what looks like document/chat symbols
  • Right: “LLM” (Large Language Model) in a blue box with a brain icon connected to various data sources/APIs in the middle

Middle row:

  • Left: “Domain Specific” in a blue box
  • Middle: A “Decision by AI” circle/node that serves as a central connection point
  • Right: “RAG” (Retrieval-Augmented Generation) in a blue box with database/server icons

Bottom row:

  • Left: “Agentic & Control Automation” in a blue box
  • Middle: A task management or workflow icon with checkmarks and a clock
  • Right: “Agentic AI” in a blue box with UI/interface icons

Arrows connect these components, showing how information and processes flow between them. The diagram appears to illustrate how a large language model integrates with retrieval-augmented generation capabilities and agentic (autonomous action-taking) functionality to form a complete AI system.
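
To make that flow concrete, here is a minimal, purely illustrative sketch of the retrieve-then-generate loop the diagram labels “RAG”. Retrieval is reduced to toy keyword overlap, and call_llm() is a placeholder for whatever LLM API a real system would use; none of these names come from the diagram.

```python
# Toy corpus standing in for the databases/servers shown next to "RAG".
DOCS = [
    "QUIC supports a 0-RTT handshake and runs over UDP.",
    "CUBIC and BBR are TCP congestion control algorithms.",
    "Data quality covers accuracy, consistency, timeliness, resolution, quantity.",
]

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k documents sharing the most words with the question."""
    q_words = set(question.lower().split())
    ranked = sorted(DOCS, key=lambda d: -len(q_words & set(d.lower().split())))
    return ranked[:k]

def call_llm(prompt: str) -> str:
    """Placeholder for a real LLM call (hosted API or local model)."""
    return f"[answer generated from a {len(prompt)}-character prompt]"

def answer(question: str) -> str:
    # Augment the prompt with retrieved context, then let the LLM generate.
    context = "\n".join(retrieve(question))
    prompt = f"Answer using only this context:\n{context}\n\nQ: {question}"
    return call_llm(prompt)

print(answer("Which congestion control algorithms exist?"))
```

An agentic layer, as in the bottom row of the diagram, would sit on top of answer(), deciding when to call it, when to query other tools, and when to act on the result.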

With Claude

Data is the next frontier of AI

Data is the backbone of AI’s evolution.

Summary 🚀

  1. High-quality data is the key to the AI era.
    • Infrastructure has advanced, but accurate and structured data is essential for building effective AI models.
    • Garbage In, Garbage Out (GIGO) principle: Poor data leads to poor AI performance.
  2. Characteristics of good data
    • High-resolution data: Provides precise information.
    • Clear labeling: Enhances learning accuracy.
    • Structured data: Enables efficient AI processing.
  3. Data is AI’s core competitive advantage.
    • Domain-specific datasets define AI performance differences.
    • Data cleaning and quality management are essential.
  4. Key messages
    • “Data is the backbone of AI’s evolution.”
    • “Good data fuels great AI!”

Conclusion

AI’s success now depends on how well data is collected, processed, and managed. Companies and researchers must focus on high-quality data acquisition and refinement to stay ahead. 🚀

With ChatGPT

GPU vs NPU in Deep Learning

This diagram illustrates the differences between GPU and NPU from a deep learning perspective:

GPU (Graphics Processing Unit):

  • Originally developed for 3D game rendering
  • In deep learning, it’s utilized for parallel processing of vast amounts of data through complex calculations during the training process
  • Characterized by “More Computing = Bigger Memory = More Power,” requiring high computing power
  • Processes big data and vectorizes information using the “Everything to Vector” approach
  • Stores learning results in Vector Databases for future use

NPU (Neural Processing Unit):

  • Retrieves information from already trained Vector DBs or foundation models to generate answers to questions
  • This process is called “Inference”
  • While the training phase processes all data in parallel, the inference phase only searches/infers content related to specific questions to formulate answers
  • Performs parallel processing similar to how neurons function

In conclusion, GPUs are responsible for processing enormous amounts of data and storing learning results in vector form, while NPUs specialize in the inference process of generating actual answers to questions based on this stored information. This relationship can be summarized as “training creates and stores vast amounts of data, while inference utilizes this at the point of need.”
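
The sketch below is only an analogy for that split, with NumPy standing in for both sides: an offline “training” stage that turns a corpus into vectors and stores them, and an “inference” stage that embeds a single query and searches the stored vectors. The embed() function and the corpus are made up for illustration.

```python
import numpy as np

def embed(texts: list[str]) -> np.ndarray:
    """Toy 'Everything to Vector' step: hash words into a 64-dim vector."""
    vecs = np.zeros((len(texts), 64))
    for i, text in enumerate(texts):
        for word in text.lower().split():
            vecs[i, hash(word) % 64] += 1.0
    return vecs / np.linalg.norm(vecs, axis=1, keepdims=True)

# "Training" side (GPU-heavy in practice): embed the whole corpus once
# and keep the result around, playing the role of the vector DB.
corpus = ["tcp congestion control", "data quality for ai", "gpu versus npu"]
vector_db = embed(corpus)

# "Inference" side (the NPU's job in the diagram): embed one query and
# look up only the stored items relevant to it.
query_vec = embed(["how does inference differ from training"])[0]
best = int(np.argmax(vector_db @ query_vec))
print("Closest stored item:", corpus[best])
```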

With Claude

AI in the data center

This diagram titled “AI in the Data Center” illustrates two key transformational elements that occur when AI technology is integrated into data centers:

1. Computing Infrastructure Changes

  • AI workloads powered by GPUs become central to operations
  • Transition from traditional server infrastructure to GPU-centric computing architecture
  • Fundamental changes in data center hardware configuration and network connectivity

2. Management Infrastructure Changes

  • Increased requirements for power (“More Power!!”) and cooling (“More Cooling!!”) to support GPU infrastructure
  • Implementation of data-driven management systems utilizing AI technology
  • AI-based analytics and management for maintaining stability and improving efficiency

These two changes are interconnected, visually demonstrating how AI technology not only revolutionizes the computing capabilities of data centers but also necessitates innovation in management approaches to effectively operate these advanced systems.

With Claude