Step 1: Determine the UTF-8 encoding bit layout
The character ⩣ has the Unicode code point U+2A63. In UTF-8 it is encoded using 3 bytes, because its code point falls in the range 0x0800 to 0xFFFF.
The encoding therefore carries 16 payload bits within the final 24 bits, using the format: 1110xxxx 10xxxxxx 10xxxxxx

where the x positions hold the payload bits.

UTF-8 encoding bit layout by code point range

Codepoint Range | Bytes | Bit pattern | Payload length |
---|---|---|---|
U+0000 - U+007F | 1 | 0xxxxxxx | 7 bits |
U+0080 - U+07FF | 2 | 110xxxxx 10xxxxxx | 11 bits |
U+0800 - U+FFFF | 3 | 1110xxxx 10xxxxxx 10xxxxxx | 16 bits |
U+10000 - U+10FFFF | 4 | 11110xxx 10xxxxxx 10xxxxxx 10xxxxxx | 21 bits |
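The table above maps directly to code. As a minimal Python sketch (the function name `utf8_template` is illustrative, not from any library), the byte count and lead-byte pattern for a given code point can be chosen like this:

```python
def utf8_template(cp: int):
    """Return (byte_count, bit_pattern) for a code point, per the table above."""
    if cp <= 0x7F:
        return 1, "0xxxxxxx"
    if cp <= 0x7FF:
        return 2, "110xxxxx 10xxxxxx"
    if cp <= 0xFFFF:
        return 3, "1110xxxx 10xxxxxx 10xxxxxx"
    if cp <= 0x10FFFF:
        return 4, "11110xxx 10xxxxxx 10xxxxxx 10xxxxxx"
    raise ValueError("code point out of Unicode range")

print(utf8_template(0x2A63))  # (3, '1110xxxx 10xxxxxx 10xxxxxx')
```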
Step 2: Obtain the payload bits

Convert the hexadecimal code point U+2A63 to binary: 00101010 01100011. Those are the payload bits.

Step 3: Fill in the bits to match the bit pattern
Obtain the final bytes by arranging the payload bits to match the bit layout:
11100010 10101001 10100011
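To double-check the result, here is a small Python sketch that packs the 16 payload bits of U+2A63 into the three-byte pattern by hand and compares the output with Python's built-in UTF-8 encoder (the helper name `encode_3byte_utf8` is made up for this example):

```python
def encode_3byte_utf8(cp: int) -> bytes:
    """Pack a code point in U+0800..U+FFFF into the 1110xxxx 10xxxxxx 10xxxxxx layout."""
    assert 0x0800 <= cp <= 0xFFFF
    b1 = 0b11100000 | (cp >> 12)           # lead byte: top 4 payload bits
    b2 = 0b10000000 | ((cp >> 6) & 0x3F)   # continuation byte: middle 6 payload bits
    b3 = 0b10000000 | (cp & 0x3F)          # continuation byte: low 6 payload bits
    return bytes([b1, b2, b3])

manual = encode_3byte_utf8(0x2A63)
print(" ".join(f"{b:02X}" for b in manual))   # E2 A9 A3
print(" ".join(f"{b:08b}" for b in manual))   # 11100010 10101001 10100011
assert manual == "\u2a63".encode("utf-8")
```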
LOGICAL OR WITH DOUBLE UNDERBAR · U+2A63
Character Information
Character Representations
Encoding | Hex | Binary |
---|---|---|
UTF-8 | E2 A9 A3 | 11100010 10101001 10100011 |
UTF-16 (big-endian) | 2A 63 | 00101010 01100011 |
UTF-16 (little-endian) | 63 2A | 01100011 00101010 |
UTF-32 (big-endian) | 00 00 2A 63 | 00000000 00000000 00101010 01100011 |
UTF-32 (little-endian) | 63 2A 00 00 | 01100011 00101010 00000000 00000000 |
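The table rows can be reproduced with Python's standard codecs; a small verification sketch (the codec names such as "utf-16-be" are standard Python codec aliases):

```python
ch = "\u2a63"  # ⩣

for name, codec in [
    ("UTF-8", "utf-8"),
    ("UTF-16 (big-endian)", "utf-16-be"),
    ("UTF-16 (little-endian)", "utf-16-le"),
    ("UTF-32 (big-endian)", "utf-32-be"),
    ("UTF-32 (little-endian)", "utf-32-le"),
]:
    data = ch.encode(codec)
    hex_bytes = " ".join(f"{b:02X}" for b in data)
    bin_bytes = " ".join(f"{b:08b}" for b in data)
    print(f"{name:<24} {hex_bytes:<14} {bin_bytes}")
```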
Description
The Unicode character U+2A63, named LOGICAL OR WITH DOUBLE UNDERBAR, is a mathematical symbol from the Supplemental Mathematical Operators block. It is a variant of the logical OR (disjunction) operator ∨ (U+2228), distinguished by the double bar drawn beneath it, and it appears in mathematical and logical notation where such operator variants are needed. The character has no direct cultural or linguistic context; its role is purely notational, as a symbol used in formal logic and related technical writing.
How to type the ⩣ symbol on Windows
Hold Alt and type 10851 on the numeric keypad; this works in applications that accept decimal Unicode Alt codes. In Microsoft Word you can also type 2A63 and press Alt+X. Alternatively, copy the character from the Character Map utility.
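If you need the character in source code rather than as typed input, it can also be produced directly from its code point; a minimal Python sketch:

```python
print(chr(0x2A63))  # ⩣  (from the numeric code point)
print("\u2a63")     # ⩣  (from a Unicode escape)
```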