RIGHTWARDS TRIANGLE-HEADED ARROW WITH DOUBLE HORIZONTAL STROKE (U+2B7C)

Character Information

Code Point: U+2B7C
Hex: 2B7C
Unicode Plane: Basic Multilingual Plane
Category: Other Symbol (So)

Character Representations

Encoding                 Hex           Binary
UTF-8                    E2 AD BC      11100010 10101101 10111100
UTF-16 (big-endian)      2B 7C         00101011 01111100
UTF-16 (little-endian)   7C 2B         01111100 00101011
UTF-32 (big-endian)      00 00 2B 7C   00000000 00000000 00101011 01111100
UTF-32 (little-endian)   7C 2B 00 00   01111100 00101011 00000000 00000000
HTML Entity: &#x2B7C;
URI Encoded: %E2%AD%BC
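
The representations above can be reproduced with Python's standard library. The following is a minimal sketch (label widths and variable names are arbitrary):

    from urllib.parse import quote

    ch = "\u2B7C"  # RIGHTWARDS TRIANGLE-HEADED ARROW WITH DOUBLE HORIZONTAL STROKE

    def show(label: str, data: bytes) -> None:
        # Print the encoding name, the bytes in hex, and the bytes in binary.
        hex_part = " ".join(f"{b:02X}" for b in data)
        bin_part = " ".join(f"{b:08b}" for b in data)
        print(f"{label:24} {hex_part:13} {bin_part}")

    show("UTF-8", ch.encode("utf-8"))                      # E2 AD BC
    show("UTF-16 (big-endian)", ch.encode("utf-16-be"))    # 2B 7C
    show("UTF-16 (little-endian)", ch.encode("utf-16-le")) # 7C 2B
    show("UTF-32 (big-endian)", ch.encode("utf-32-be"))    # 00 00 2B 7C
    show("UTF-32 (little-endian)", ch.encode("utf-32-le")) # 7C 2B 00 00

    print("HTML Entity:", f"&#x{ord(ch):X};")  # &#x2B7C;
    print("URI Encoded:", quote(ch))           # %E2%AD%BC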

Description

The Unicode character U+2B7C, "RIGHTWARDS TRIANGLE-HEADED ARROW WITH DOUBLE HORIZONTAL STROKE", is an arrow symbol in the Miscellaneous Symbols and Arrows block of the Basic Multilingual Plane. Its distinctive shape, a triangle-headed rightwards arrow crossed by two horizontal strokes, sets it apart from the plain rightwards triangle-headed arrow and the other stroke variants in the same block. Like those arrows it carries no fixed meaning of its own: it serves as a directional or typographical symbol in diagrams, user-interface text, and technical documentation, and in digital text it is processed like any other Unicode symbol character.

How to type the symbol on Windows

Hold Alt and type 11132 on the numeric keypad (this works in applications that accept decimal Unicode Alt codes), or pick the character from Character Map.
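
The Alt code is simply the decimal value of the code point (hex 2B7C = decimal 11132), which a short Python check confirms:

    # 0x2B7C in decimal is 11132, the value typed after Alt.
    assert 0x2B7C == 11132
    print(chr(0x2B7C))  # prints the arrow character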

  1. Step 1: Determine the UTF-8 encoding bit layout

    The character has the Unicode code point U+2B7C. In UTF-8 it is encoded with 3 bytes, because its code point falls in the range U+0800 to U+FFFF.

    The three bytes therefore carry 16 payload bits within their 24 bits and follow the pattern 1110xxxx 10xxxxxx 10xxxxxx, where each x is a payload bit.

    UTF-8 encoding bit layout by code point range

    Code Point Range     Bytes   Bit Pattern                           Payload Length
    U+0000 - U+007F      1       0xxxxxxx                              7 bits
    U+0080 - U+07FF      2       110xxxxx 10xxxxxx                     11 bits
    U+0800 - U+FFFF      3       1110xxxx 10xxxxxx 10xxxxxx            16 bits
    U+10000 - U+10FFFF   4       11110xxx 10xxxxxx 10xxxxxx 10xxxxxx   21 bits
  2. Step 2: Obtain the payload bits:

    Convert the hexadecimal code point 2B7C to binary: 00101011 01111100. These 16 bits are the payload bits.

  3. Step 3: Fill in the bits to match the bit pattern:

    Obtain the final bytes by distributing the payload bits into the bit layout (4 bits into the first byte, 6 into each continuation byte), as the code sketch after these steps also shows:
    11100010 10101101 10111100
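
The three steps can be carried out directly in code. The sketch below hand-assembles the three UTF-8 bytes for a code point in the U+0800 to U+FFFF range and checks the result against Python's built-in encoder; the function name utf8_encode_3byte is only illustrative:

    def utf8_encode_3byte(codepoint: int) -> bytes:
        """Encode a code point in the range U+0800..U+FFFF as three UTF-8 bytes."""
        # A full encoder would also reject the surrogate range U+D800..U+DFFF.
        assert 0x0800 <= codepoint <= 0xFFFF
        # Split the 16 payload bits into groups of 4, 6 and 6 bits,
        # then prepend the 1110, 10 and 10 marker bits.
        byte1 = 0b11100000 | (codepoint >> 12)          # 1110xxxx
        byte2 = 0b10000000 | ((codepoint >> 6) & 0x3F)  # 10xxxxxx
        byte3 = 0b10000000 | (codepoint & 0x3F)         # 10xxxxxx
        return bytes([byte1, byte2, byte3])

    encoded = utf8_encode_3byte(0x2B7C)
    print(" ".join(f"{b:02X}" for b in encoded))   # E2 AD BC
    print(" ".join(f"{b:08b}" for b in encoded))   # 11100010 10101101 10111100
    assert encoded == "\u2B7C".encode("utf-8")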