AI Prerequisite

From ChatGPT with some prompting
The image illustrates the complexity of AI processing and, above all, the importance of the input data. It begins with data collected from many sources, from people, industry, and nature down to the microscopic scale of atoms and out to space, which is fed into an AI system. This data is processed through what is labeled 'Super Parallel Computing', described as 'unexplainable' to suggest the intricate, potentially incomprehensible nature of AI computation. A red 'X' button marked 'IF wrong Data/translate' indicates that data must be corrected when it is wrong or poorly translated: even the most advanced computing produces a 'bad result', represented by the small blurred figure, when its input data is flawed. The label 'WOW' signifies the astonishing results AI can produce when it functions correctly, yet this outcome is contingent on the quality and accuracy of the input data.

Overall, this diagram serves as a visual warning that the power of AI technology is reliant on the integrity of its data. Inaccurate data can lead to adverse outcomes, even with the use of sophisticated AI, as highlighted by the image’s contrast between the potential for amazement and the risk of poor results.


Data is

From DALL-E with some prompting
The image conveys the concept that data fundamentally stems from energy, which is harnessed and controlled to create meaningful information. It illustrates a progression from an energy symbol to a sine wave representing frequency, followed by a rectangular waveform symbolizing control, culminating in binary code blocks that represent data. The diagram encapsulates the idea that data is a form of controlled energy, systematically transformed to serve our purposes.

BGP Flow

From Gemini with some prompting
Example Presentation Script

  1. BGP Session Overview

Hello everyone. Today, we will delve into the details of the BGP session establishment process. BGP is an internet routing protocol that facilitates the exchange of routing information between different autonomous systems. Establishing a stable BGP session is critical for efficient traffic forwarding across the internet.

  2. TCP Connection Establishment

A BGP session commences with a TCP 3-way handshake on port 179. After establishing a reliable connection, the session proceeds to the Open message exchange phase to negotiate the fundamental parameters for the BGP session.
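
As a rough illustration of this transport step, the sketch below opens a TCP connection the way a BGP speaker would. A local listener on an ephemeral port stands in for the remote peer, since binding the privileged port 179 requires root; all addresses and ports here are illustrative only:

```python
import socket

BGP_PORT = 179  # well-known BGP port; a real peer listens here

# A local listener stands in for the remote BGP speaker. We bind an
# ephemeral port because binding to 179 normally requires privileges.
listener = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
listener.bind(("127.0.0.1", 0))
listener.listen(1)
peer_addr = listener.getsockname()

# connect() drives the kernel's 3-way handshake (SYN, SYN-ACK, ACK).
# Only once it returns can the Open message exchange begin.
client = socket.create_connection(peer_addr, timeout=5)
server_side, _ = listener.accept()
```

Only after both endpoints hold an established connection does BGP itself start talking.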

  3. Open Message Exchange and Keepalive Message Exchange

The Open message exchange negotiates BGP parameters such as the protocol version, autonomous system number, and Hold Time. The Hold Time defines how long the session may stay silent before it is considered down. Keepalive messages are exchanged periodically to keep the session alive; if no Keepalive (or other) message is received within the Hold Time, the session terminates.
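
Because Keepalive messages carry no body, they make a compact example of BGP's wire format. The sketch below builds the fixed 19-byte message header defined by RFC 4271 (the helper name is my own, not from any real BGP library):

```python
import struct

# BGP message type codes (RFC 4271)
OPEN, UPDATE, NOTIFICATION, KEEPALIVE = 1, 2, 3, 4
MARKER = b"\xff" * 16  # 16-byte marker, all ones

def bgp_message(msg_type: int, body: bytes = b"") -> bytes:
    """Frame a BGP message: marker, 2-byte total length, 1-byte type, body."""
    total_length = 19 + len(body)  # the header alone is 19 bytes
    return MARKER + struct.pack("!HB", total_length, msg_type) + body

# A Keepalive is just the bare header, sent periodically (commonly every
# Hold Time / 3 seconds) to stop the peer's hold timer from expiring.
keepalive = bgp_message(KEEPALIVE)
```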

  4. Update Message Transmission and Path Selection

The core of the BGP session lies in the Update message transmission. Update messages contain new, modified, or withdrawn routing information. They include network, next hop, and path attribute information, enabling routing table updates and optimal path selection.
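
The path-selection step can be sketched as a tie-break over route attributes. The simplified comparator below keeps only the first two steps of the real decision process, highest LOCAL_PREF and shortest AS path; the `Route` type and its field names are illustrative, not taken from any real BGP implementation:

```python
from dataclasses import dataclass

@dataclass
class Route:
    prefix: str
    next_hop: str
    as_path: tuple          # e.g. (65010, 65020)
    local_pref: int = 100   # BGP's default LOCAL_PREF

def best_path(routes):
    """Prefer the highest LOCAL_PREF, then the shortest AS path.

    Real BGP continues with further tie-breakers (origin, MED,
    eBGP over iBGP, router ID, ...), omitted here for brevity.
    """
    return max(routes, key=lambda r: (r.local_pref, -len(r.as_path)))

candidates = [
    Route("203.0.113.0/24", "192.0.2.1", (65010, 65020)),
    Route("203.0.113.0/24", "192.0.2.2", (65030,)),
]
best = best_path(candidates)  # the one-hop AS path wins
```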

  5. Route Withdrawal and Loop Prevention

Obsolete routes are announced in the withdrawn-routes field of Update messages and subsequently removed from the routing table. The AS path attribute prevents routing loops: a router rejects any route whose AS path already contains its own AS number. It also allows each AS to apply policy to the routing information it exchanges.
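
The loop-prevention rule itself reduces to a one-line check: a speaker drops any route whose AS path already contains its own AS number. A minimal sketch, with an example ASN and an illustrative function name:

```python
LOCAL_ASN = 65001  # this router's autonomous system number (example value)

def accept_route(as_path: tuple, local_asn: int = LOCAL_ASN) -> bool:
    """RFC 4271 loop prevention: reject routes whose AS_PATH already
    contains our own AS number, since accepting one would form a loop."""
    return local_asn not in as_path

ok = accept_route((65010, 65020))      # unseen path: accepted
looped = accept_route((65010, 65001))  # our ASN present: rejected
```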

  6. Conclusion

The BGP session establishment process comprises TCP connection establishment, Open message exchange, Keepalive message exchange, Update message transmission (including route withdrawal), path selection, loop prevention, and policy enforcement. This process ensures a stable BGP session and facilitates efficient routing information exchange.

Data handling

From Gemini with some prompting
Image Interpretation: Using AI, Deep Learning, and Quantum Computing for Data Analysis to Drive Future Advancement

  1. Limitations of Data Interpretation: Human Effort and Computing

Humans have traditionally interpreted data through note-taking and computing.
However, human interpretation capabilities have reached their limits as data volumes have grown.

  2. AI, Deep Learning, and Quantum Computing: New Possibilities for Data Interpretation

Advances in Artificial Intelligence (AI), Deep Learning, and Quantum Computing technologies have opened up new possibilities for data interpretation.
These technologies can analyze data much faster and more accurately than humans.

  3. Next Step Advancement through Data Analysis

Data analysis using AI, Deep Learning, and Quantum Computing technologies can lead to advancements in the following areas:
Scientific Research: New drug development, new material development, environmental problem solving, etc.
Healthcare: Disease diagnosis and treatment, personalized healthcare services, etc.
Industry: Production efficiency improvement, new product development, new business model discovery, etc.
Society: Social problem-solving, policy decision support, safety and security reinforcement, etc.

  4. Conclusion: Data Analysis, the Key to the Future

Data analysis using AI, Deep Learning, and Quantum Computing technologies will be the key driver of future development.
Through data analysis, we can create a better future.

From DALL-E with some prompting
The image conceptualizes the evolution and advancement in data processing. Initially, data understandable at a human level is processed using traditional note-taking and computing methods. This is represented by a simple binary code and is processed using tools symbolized by the human brain and computers.

However, the emergence of larger and more complex datasets necessitated new tools that extend beyond human processing capabilities. As illustrated at the bottom of the image, an extended binary code represents the data that must be handled by advanced tools based on Artificial Intelligence (AI), Deep Learning, and Quantum Physics. This data format is interpretable by machines, specifically AI created by humans.

These advanced tools enable the meaningful analysis of complex data, leading to the “Analysis & Development” stage depicted on the right side of the image. This signifies how insights derived from the processing are used to solve real-world problems and drive new developments.

Therefore, this image is more than a description of data processing methods; it provides an overview of how modern technologies are managing massive datasets and how the resulting analyses can lead to tangible advancements and development goals.

My own AI agent

From DALL-E with some prompting
This image appears to be a conceptual diagram of an individual’s AI agent, divided into several parts:

  1. Personal Area: There’s a user icon with arrows labeled ‘Control’ and ‘Sensing All’. This suggests the user can direct the AI agent and the AI is capable of gathering comprehensive information from its environment.
  2. Micro & Macro Infinite World: This part features illustrations that seem to represent microorganisms, plants, butterflies, etc., indicating that the AI collects data from both microscopic and macroscopic environments.
  3. Personalized Resource: The icon resembling a human brain could represent personalized services or data tailored to the user.
  4. Cloud Infra: The cloud infrastructure is presumably responsible for data processing and storage.
  5. Cloud Service: Depicted as a server providing various services, connected to the cloud infrastructure.
  6. Internet Connected: A globe icon with various network points suggests that the AI agent is connected to global information and knowledge via the internet.

Overall, the diagram illustrates a personalized AI agent that collects information under the user’s control, processes it through cloud infrastructure and services, and ultimately contributes to collective intelligence through an internet connection.