TIFINAGH LETTER YAG · U+2D33

Character Information

Code Point: U+2D33
Hex: 2D33
Unicode Plane: Basic Multilingual Plane
Category: Other Letter
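
These properties can also be confirmed programmatically. The short Python sketch below uses only the standard-library unicodedata module; the exact formatting of the output is just an illustrative choice.

    import unicodedata

    ch = "\u2D33"                      # TIFINAGH LETTER YAG

    print(hex(ord(ch)))                # 0x2d33
    print(unicodedata.name(ch))        # TIFINAGH LETTER YAG
    print(unicodedata.category(ch))    # Lo ("Letter, other")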

Character Representations

Encoding                 Hex           Binary
UTF-8                    E2 B4 B3      11100010 10110100 10110011
UTF-16 (big-endian)      2D 33         00101101 00110011
UTF-16 (little-endian)   33 2D         00110011 00101101
UTF-32 (big-endian)      00 00 2D 33   00000000 00000000 00101101 00110011
UTF-32 (little-endian)   33 2D 00 00   00110011 00101101 00000000 00000000
HTML Entity: &#x2D33;
URI Encoded: %E2%B4%B3
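
For reference, the same representations can be reproduced with a few lines of Python (a minimal sketch; bytes.hex with a separator requires Python 3.8 or newer, and the HTML-entity line simply formats the code point in hexadecimal):

    import urllib.parse

    ch = "\u2D33"

    print(ch.encode("utf-8").hex(" ").upper())      # E2 B4 B3
    print(ch.encode("utf-16-be").hex(" ").upper())  # 2D 33
    print(ch.encode("utf-16-le").hex(" ").upper())  # 33 2D
    print(ch.encode("utf-32-be").hex(" ").upper())  # 00 00 2D 33
    print(ch.encode("utf-32-le").hex(" ").upper())  # 33 2D 00 00
    print(f"&#x{ord(ch):X};")                       # &#x2D33;
    print(urllib.parse.quote(ch))                   # %E2%B4%B3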

Description

The character U+2D33, TIFINAGH LETTER YAG, is a letter of the Tifinagh script, an alphabet of Berber origin used in North Africa. It represents the voiced velar stop /ɡ/, which in some Berber varieties is realized as the fricative [ʝ]. In digital text it is used chiefly for writing Amazigh (Berber) languages such as Tamazight and the Tuareg languages, which employ the Tifinagh script. The alphabet is historically significant: it has been used by Berber communities across North Africa and the Sahara and has survived repeated periods of colonization and cultural suppression. Today it continues to help preserve this linguistic heritage and serves as a tool for promoting cultural identity and knowledge of these languages.

How to type the symbol on Windows

In applications that accept decimal Unicode Alt codes, hold Alt and type 11571 on the numeric keypad (support varies by application). Alternatively, type 2D33 followed by Alt+X in editors that support it, such as Word or WordPad, or copy the character from Character Map.

How the UTF-8 encoding is derived

  1. Step 1: Determine the UTF-8 encoding bit layout

    The character has the Unicode code point U+2D33. In UTF-8 it is encoded with 3 bytes, because its code point falls in the range 0x0800 to 0xFFFF.

    The 3-byte form therefore carries 16 payload bits spread across 24 bits, following the bit pattern 1110xxxx 10xxxxxx 10xxxxxx,
    where the x positions hold the payload bits.

    UTF-8 encoding bit layout by code point range

    Code point range     Bytes   Bit pattern                          Payload length
    U+0000 - U+007F      1       0xxxxxxx                             7 bits
    U+0080 - U+07FF      2       110xxxxx 10xxxxxx                    11 bits
    U+0800 - U+FFFF      3       1110xxxx 10xxxxxx 10xxxxxx           16 bits
    U+10000 - U+10FFFF   4       11110xxx 10xxxxxx 10xxxxxx 10xxxxxx  21 bits
  2. Step 2: Obtain the payload bits:

    Convert the hexadecimal code point U+2D33 to binary: 00101101 00110011. Those are the payload bits.

  3. Step 3: Fill in the bits to match the bit pattern:

    Obtain the final bytes by splitting the 16 payload bits into groups of 4, 6, and 6 (0010 | 110100 | 110011) and slotting them into the x positions of the pattern 1110xxxx 10xxxxxx 10xxxxxx (a code sketch of the full procedure follows this list):
    11100010 10110100 10110011
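
To make the three steps concrete, here is a small Python sketch that assembles the 3-byte UTF-8 sequence for U+2D33 by hand and checks the result against Python's built-in encoder. The helper name encode_utf8_3byte is just an illustrative choice, and bytes.hex with a separator assumes Python 3.8 or newer.

    def encode_utf8_3byte(codepoint: int) -> bytes:
        """Manually encode a code point in the range U+0800..U+FFFF as 3 UTF-8 bytes."""
        assert 0x0800 <= codepoint <= 0xFFFF
        byte1 = 0b11100000 | (codepoint >> 12)          # 1110xxxx: top 4 payload bits
        byte2 = 0b10000000 | ((codepoint >> 6) & 0x3F)  # 10xxxxxx: middle 6 payload bits
        byte3 = 0b10000000 | (codepoint & 0x3F)         # 10xxxxxx: low 6 payload bits
        return bytes([byte1, byte2, byte3])

    encoded = encode_utf8_3byte(0x2D33)
    print(encoded.hex(" ").upper())                # E2 B4 B3
    print(" ".join(f"{b:08b}" for b in encoded))   # 11100010 10110100 10110011
    assert encoded == "\u2D33".encode("utf-8")     # matches the built-in encoder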