The process of communication sometimes brings trouble, and we often over-complicate it, increasing the entropy (we’ll soon see what that means) and producing a poor communication process. To communicate better, we need a clear mind. That reminded me of Shannon’s theory of communication. Here, I synthesize some of his ideas and connect them to our everyday communication.

Input and output

We receive inputs from many sources. It’s important to identify when new information arrives and to clarify concepts so we can communicate them later. If the source is of poor quality, the digestion process is tricky. Also, if we accumulate too much input without synthesizing it, it can backfire.

    flowchart TB

        I(INPUT)

        a[Video]  -.-> I
        b[Book]  -.-> I
        c[Experience]  -.-> I
        d[Idea]  -.-> I


        I ==> C

        C["Digest and understand<br>Feynman technique"]

        C ==> O

        O(OUTPUT)

        O -.-> e[Speaking]
        O -.-> f[Writing]
        O -.-> g[Product]

We receive a lot of information; knowing how to synthesize, differentiate, connect, and structure those ideas is a key skill we need to develop.

Feynman technique in a nutshell

The Feynman Technique is a four-step method of learning that emphasizes understanding by explaining complex topics in simple terms: study a concept, explain it as if teaching someone new to it, identify the gaps in your explanation and go back to the source, then simplify until it is clear.

Be descriptive

If you are telling a story—for example, hiking a mountain—describe what you saw, how your breath felt, how the ground was, how the weather was. Were you tired? Were you alone? What were you thinking at that moment?

This pulls people into your story, helps them imagine the scene, and connects your description to their own memories.

Exposition

The best way to improve is practice. I’m not a native English speaker, and this exercise of writing is crucial for me. It’s not perfect, but it exists, and that is better than not existing at all. Sometimes it’s hard to start, but a good strategy is committing to at least one paragraph; inspiration comes, and soon you’re immersed. Another idea is to film yourself speaking about a topic for one minute and then watch it back.

    ---
    title: Communication of Information
    ---
    flowchart LR

        S[SOURCE] --message--> T[TRANSMITTER]
        T --signal--> CH[CHANNEL]
        N[NOISE SOURCE] --noise--> CH
        CH --signal--> R[RECEIVER]
        R --> D[DESTINATION]
    

Source: The Mathematical Theory of Communication, Shannon, Claude E.

Shannon describes the elements that take part in communication, shown in the diagram above: a source, a transmitter, a channel, a noise source, a receiver, and a destination.
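
As a rough illustration, and not anything taken from Shannon’s book, the same chain can be sketched in Python: a source message is encoded by a transmitter, passes through a channel where noise may corrupt it, and is rebuilt by a receiver for the destination. The function names and the noise probability are my own assumptions.

    import random

    def transmit(message):
        """Transmitter: turn the message into a signal (here, a list of characters)."""
        return list(message)

    def channel(signal, noise_prob=0.1):
        """Channel: noise occasionally replaces a symbol with '?'."""
        return ['?' if random.random() < noise_prob else s for s in signal]

    def receive(signal):
        """Receiver: rebuild the message for the destination."""
        return ''.join(signal)

    # Source -> transmitter -> channel (with noise) -> receiver -> destination
    destination = receive(channel(transmit("a clear idea")))
    print(destination)  # some characters may arrive distorted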

Connecting with Noise and Redundancy

Shannon showed that communication is not only about the message, but also about what happens during transmission. A simple way to think about it is:

I = R − N

I (Information received): what really arrives at the destination.

R (Redundancy): the extra repetition or structure we add to make sure the message survives distortion.

N (Noise): the interference that distorts or erases parts of the message.
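
To make the idea concrete, here is a minimal Python sketch under assumptions of my own (a 10% flip rate and three copies per bit, numbers chosen only for illustration): a short message crosses a noisy channel once without redundancy and once with simple repetition, and a majority vote at the receiver usually recovers the original despite the noise.

    import random

    def noisy_channel(bits, flip_prob=0.1):
        """The noise N: flip each bit with probability flip_prob."""
        return [b ^ 1 if random.random() < flip_prob else b for b in bits]

    def add_redundancy(bits, copies=3):
        """The redundancy R: repeat every bit `copies` times."""
        return [b for b in bits for _ in range(copies)]

    def majority_decode(bits, copies=3):
        """Receiver side: recover each original bit by majority vote."""
        groups = [bits[i:i + copies] for i in range(0, len(bits), copies)]
        return [1 if sum(g) > copies // 2 else 0 for g in groups]

    message = [1, 0, 1, 1, 0, 0, 1, 0]

    # Without redundancy, whatever the noise flips is lost for good.
    plain = noisy_channel(message)

    # With redundancy, the receiver can usually vote the noise away.
    redundant = majority_decode(noisy_channel(add_redundancy(message)))

    print("original       :", message)
    print("no redundancy  :", plain)
    print("with redundancy:", redundant)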


Entropy is the raw potential information before noise and redundancy are applied. So in practice:

High entropy (lots of surprise in the message) needs redundancy to ensure it gets through.

Low entropy (predictable messages) can travel with little redundancy, but also carries less real information.
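
A rough way to see the difference in Python, using Shannon’s entropy H = −Σ p(x) · log2 p(x) over the symbol frequencies of a message; the two example strings are just illustrations I made up. A repetitive, predictable message scores low, while a varied one scores higher.

    from collections import Counter
    from math import log2

    def entropy_per_symbol(text):
        """Shannon entropy H = -sum(p * log2(p)) over symbol frequencies."""
        counts = Counter(text)
        total = len(text)
        return -sum((c / total) * log2(c / total) for c in counts.values())

    predictable = "aaaaaaaaab"            # little surprise -> low entropy
    varied      = "communicate clearly"   # more varied symbols -> higher entropy

    print(f"low entropy : {entropy_per_symbol(predictable):.2f} bits per symbol")
    print(f"high entropy: {entropy_per_symbol(varied):.2f} bits per symbol")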

In practice

Some questions to ask yourself when communicating, to put these concepts to use:

    flowchart TB
        E{Is there high entropy?}
        E -- yes --> R[Increase redundancy]
        E -- no --> C[Continue communicating]

        N{Is there noise?}
        N -- yes --> R
        N -- no --> C

        R --> C
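
If it helps, the same checklist can be written as a tiny Python sketch; high_entropy and noisy are flags you judge for yourself before speaking or writing, not something the code can measure.

    def plan_message(high_entropy, noisy):
        """Mirror the flowchart: add redundancy when the message is
        surprising or the channel is noisy, otherwise just communicate."""
        steps = []
        if high_entropy or noisy:
            steps.append("increase redundancy: repeat key points, summarize, give examples")
        steps.append("continue communicating")
        return steps

    # A surprising idea over a noisy channel (say, a crowded chat) needs redundancy.
    print(plan_message(high_entropy=True, noisy=True))

    # A predictable update over a clear channel can travel as it is.
    print(plan_message(high_entropy=False, noisy=False))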
    

When is there high entropy?

There is high entropy when the message carries a lot of surprise for the receiver: the ideas are new, unfamiliar, or hard to predict. In those cases, add redundancy: repeat the key points, give examples, and summarize before moving on.

Communicating well is a key skill; it also helps you think better and helps others think better too. Practice makes the master, so don’t be scared to practice: you can communicate better if you understand what you are doing. I wrote this imperfectly, but that is the key to getting better; nothing perfect can be improved, only the imperfect can.