When we use our computers, what we normally see on our screens are letters and images that we understand as humans. Text, for instance, is displayed in the regular alphabet that we use in our languages. But what many don't know is that behind the screen, information is processed a little differently. You see, our computers don't take in information the same way it is displayed to us. They have their own language, if you will, with entirely different letters and symbols and entirely different ways to represent the characters we are familiar with. Like a bilingual person who can speak one language but think in another, computers process things differently from how they are finally presented to us.
We were finally able to translate and understand computers.
To understand how this works, we first need to dive into what computers actually do, and how they do it. Computers operate on what we call bits. Many have seen them referenced in pop culture and other media, such as films like The Matrix and hacking videos, with rows of green digits scrolling up the screen. When it comes to representing the basic inner workings of computers, most people know that it all comes down to ones and zeros. But rarely do people know why, beyond knowing the name of this system: binary. A binary digit, or bit, is the smallest unit of computing possible, made up of transistors that switch on and off, corresponding to whether the number it represents is a 0 (off) or a 1 (on). This binary system of counting is similar to the regular decimal method, where each position of a digit from 0-9 has a value multiplied in tens (such as a 2 in the 100s position meaning 200). In binary, each position of a digit, either 1 or 0, has a value multiplied in twos: a 1 in the 8s position, followed by two zeros and a 1 in the 1s position (1001), adds up to 8 + 1 = 9. The binary system of counting is now the base of computing, where each different combination of digits comes together to represent something different. But if we were only to represent information in this base form, nobody would use it, because of how impractical it is. Thus, the next challenge was to turn it into something more comprehensible for humans.
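To make the positional idea concrete, here is a minimal sketch in Python (my own illustration, not from the sources above) that adds up the place values of the binary string 1001 exactly as described, and checks the result against Python's built-in conversion:

    bits = "1001"  # a 1 in the 8s place and a 1 in the 1s place
    value = 0
    for position, bit in enumerate(reversed(bits)):
        value += int(bit) * (2 ** position)  # each position is worth a power of two
    print(value)           # 9, since 8 + 0 + 0 + 1 = 9
    print(int("1001", 2))  # Python's built-in base-2 conversion agrees: 9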
The key was ensuring there were enough of these bits to represent more complex information[1]. This required transistors to become smaller, so that circuits could host more of this simple on-or-off, yes-or-no, true-or-false data and combine it in series to create more complex data; going from the single bit to the byte (which is 8 bits put together). But that wasn't enough just yet. The physical limitations of the time meant that a single byte could only hold 256 different combinations (the values 0 through 255). As time went on, the improvement of computer technology and the development of smaller transistors allowed for even more combinations, with a 32-bit system allowing over four billion of them. The only thing left to do was to translate that data, comprehensible only to computers, into something that normal people could understand, meaning representing this data as text, images, and even sounds. Binary itself, given enough bits, can represent all of these things, and we as humans have developed many methods of converting the data in our computers into something we can all understand. One of them is the American Standard Code for Information Interchange, or ASCII (pronounced: "ask-ee").
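The "more bits, more combinations" rule is just powers of two, and a short sketch (again my own illustration, using only Python built-ins) can print the counts mentioned above:

    # n bits can hold 2 ** n distinct patterns
    for n in (1, 8, 16, 32):
        print(n, "bits ->", f"{2 ** n:,}", "combinations")
    # 1 bits -> 2
    # 8 bits -> 256 (one byte, values 0 through 255)
    # 16 bits -> 65,536 (two bytes)
    # 32 bits -> 4,294,967,296 (over four billion)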
The American Standard Code for Information Interchange, or ASCII, is one of the ways we can interpret computer data and convert it into something that we can understand. Since computers only understand numbers, ASCII represents the English characters we're familiar with by assigning each of them a number from 0 to 127 (in decimal), stored in a single byte (a row of 8 transistors).[2] ASCII codes are numerical representations of those binary codes.[3] Take the letter A, for example. In ASCII it is represented by the number 65, which in turn is represented in binary as 01000001 (the 1 in the 64s position added to the 1 in the 1s position equals 65, hence the ASCII code of 65), meaning that inside the computer, a row of 8 transistors forms a combination equal to the letter A. The 8-bit byte allowed 256 different combinations, far more than enough for what ASCII needed at the time. But this still wasn't enough, as it limited the characters that could be represented. So again we depended on computers being built better before we could finally rearrange things so that a computer could recognise two bytes as a single character, increasing the possible combinations from 256 to 65,536. With this we were finally able to turn our computing into something that anyone could understand, making personal computers more accessible to everyone. With this we were able to write code and create apps and software, social media platforms, video games, music, and videos.
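You can check the letter A example yourself with a minimal Python sketch (my own illustration; ord(), chr(), and format() are standard built-ins):

    print(ord("A"))                 # 65: the ASCII code for 'A'
    print(format(ord("A"), "08b"))  # 01000001: the same code written as 8 bits
    print(chr(65))                  # 'A': turning the number back into a character
    # 64 + 1 = 65, matching the two 1s in 01000001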
We were finally able to translate and understand computers.
Sources:
[1] Khan Academy - Binary & Data
[2] Webopedia - ASCII
[3] ASCII Table
Diemas Sukma Hawkins
106218021