DIGIT THREE · U+0033

3

Character Information

Code Point: U+0033
Hex: 0033
Unicode Plane: Basic Multilingual Plane
Category: Decimal Digit Number
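
These properties can also be read programmatically. The short Python snippet below uses only the standard unicodedata module to print the code point, name, general category, and decimal value of the character:

    import unicodedata

    ch = "3"
    print(f"U+{ord(ch):04X}")        # U+0033
    print(unicodedata.name(ch))      # DIGIT THREE
    print(unicodedata.category(ch))  # Nd (Decimal Digit Number)
    print(unicodedata.decimal(ch))   # 3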

Character Representations

Encoding                 Hex            Binary
UTF-8                    33             00110011
UTF-16 (big endian)      00 33          00000000 00110011
UTF-16 (little endian)   33 00          00110011 00000000
UTF-32 (big endian)      00 00 00 33    00000000 00000000 00000000 00110011
UTF-32 (little endian)   33 00 00 00    00110011 00000000 00000000 00000000

HTML Entity: &#51; (or &#x33;)
URI Encoded: 3 (digits are unreserved and need no percent-encoding)
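
These representations can be reproduced with a short Python sketch: the standard codecs produce the UTF-8/16/32 byte sequences, and the HTML numeric reference and percent-encoded form follow directly from the code point:

    import urllib.parse

    ch = "3"
    for codec in ("utf-8", "utf-16-be", "utf-16-le", "utf-32-be", "utf-32-le"):
        data = ch.encode(codec)
        bits = " ".join(format(b, "08b") for b in data)
        print(f"{codec:9}  hex: {data.hex(' '):11}  bin: {bits}")

    print("HTML entity:", f"&#{ord(ch)};")         # &#51;
    print("URI encoded:", urllib.parse.quote(ch))  # 3 (digits need no escaping)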

Description

The Unicode character U+0033, DIGIT THREE, represents the numeral "3" and is one of the ten decimal digits of digital text. It appears wherever quantities are counted, numbered, or sequenced, and it is used constantly in programming languages, mathematical notation, and data formats. Because Unicode assigns the digit a single standardized code point, text containing it can be exchanged between systems, platforms, and languages without ambiguity, which keeps numerical data consistent worldwide.

DIGIT THREE belongs to the Basic Latin block (U+0000 - U+007F), the first block of the Unicode standard. This block mirrors the original ASCII character set and contains the control codes, punctuation, digits, and unaccented Latin letters on which most digital communication is still built today.

How to type the 3 symbol on Windows

Hold down the Alt key, type 0051 on the numeric keypad, then release Alt; 51 is the decimal value of code point U+0033. Alternatively, insert the character with the Character Map utility (charmap.exe).
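
The Alt code is simply the code point written in base 10. A quick sanity check of that relationship in Python (purely arithmetic, not tied to any Windows API):

    # 51 is the decimal form of the hexadecimal code point 0x33.
    assert 51 == 0x33
    assert chr(51) == "3"           # decimal 51 -> character "3"
    assert ord("3") == 0x33         # character "3" -> code point 0x33
    print(f"U+{ord('3'):04X}")      # U+0033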

How the 3 character is encoded in UTF-8

  1. Step 1: Determine the UTF-8 encoding bit layout

    The character 3 has the Unicode code point U+0033. In UTF-8 it is encoded as a single byte, because its code point falls in the range U+0000 to U+007F.

    That byte therefore carries 7 payload bits within its 8 bits and has the format 0xxxxxxx, where each x is a payload bit.

    UTF-8 encoding bit layout by code point range

    Code Point Range     Bytes   Bit Pattern                           Payload Length
    U+0000 - U+007F      1       0xxxxxxx                              7 bits
    U+0080 - U+07FF      2       110xxxxx 10xxxxxx                     11 bits
    U+0800 - U+FFFF      3       1110xxxx 10xxxxxx 10xxxxxx            16 bits
    U+10000 - U+10FFFF   4       11110xxx 10xxxxxx 10xxxxxx 10xxxxxx   21 bits
  2. Step 2: Obtain the payload bits:

    Convert the code point value 0x33 to binary: 110011. Padded to the 7 payload bits of the single-byte pattern, this gives 0110011.

  3. Step 3: Fill in the bits to match the bit pattern:

    Obtain the final byte by placing the payload bits into the bit layout: 0xxxxxxx filled with 0110011 gives 00110011, i.e. 0x33. The sketch after this list performs the same computation in code.
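
Here is a minimal Python sketch of the procedure above. It applies the bit-layout table directly (surrogate code points are not special-cased), and the built-in str.encode is used only to cross-check the result:

    def utf8_encode(code_point: int) -> bytes:
        """Encode one Unicode code point to UTF-8 by hand, per the table above."""
        if code_point <= 0x7F:
            # 1 byte: 0xxxxxxx (7 payload bits)
            return bytes([code_point])
        elif code_point <= 0x7FF:
            # 2 bytes: 110xxxxx 10xxxxxx (11 payload bits)
            return bytes([0b11000000 | (code_point >> 6),
                          0b10000000 | (code_point & 0x3F)])
        elif code_point <= 0xFFFF:
            # 3 bytes: 1110xxxx 10xxxxxx 10xxxxxx (16 payload bits)
            return bytes([0b11100000 | (code_point >> 12),
                          0b10000000 | ((code_point >> 6) & 0x3F),
                          0b10000000 | (code_point & 0x3F)])
        elif code_point <= 0x10FFFF:
            # 4 bytes: 11110xxx 10xxxxxx 10xxxxxx 10xxxxxx (21 payload bits)
            return bytes([0b11110000 | (code_point >> 18),
                          0b10000000 | ((code_point >> 12) & 0x3F),
                          0b10000000 | ((code_point >> 6) & 0x3F),
                          0b10000000 | (code_point & 0x3F)])
        raise ValueError("code point outside the Unicode range")

    encoded = utf8_encode(0x0033)
    print(encoded.hex())                   # 33
    print(format(encoded[0], "08b"))       # 00110011
    assert encoded == "3".encode("utf-8")  # matches the built-in encoder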