Math plays a huge role in computer science. One example of this is algorithms. An algorithm is a step-by-step mathematical procedure for solving a problem. Encryption is a good example: an encryption algorithm takes a message from one user and produces a secure, encrypted version that can be sent safely to another, as explained in Nine Algorithms That Changed the Future (chapter 4). A toy sketch of the idea is shown below. Algorithms also determine how fast the programs and tasks a computer runs will finish. A more optimized algorithm results in faster operations, so the creation of better algorithms is very important to computer scientists.

Another connection between computer science and math is in the area of parallel computing. Parallel computing is a process that performs multiple computations at the same time (https://en.wikipedia.org/wiki/Theoretical_computer_science). This helps increase the speed of programs and computers because splitting large problems into smaller pieces and dividing them among processors lets each processor move on to its next task faster. But this speed can also produce interesting bugs, like the race condition that we demonstrated in Scratch. A race condition is when the timing and ordering of operations in a program causes an operation to be performed more (or fewer) times than intended, which can result in incorrect outputs; a small sketch of the same kind of bug appears below. Amazingly enough, the maximum speed-up that a program can get from parallel computing is itself given by another formula, Amdahl's law, also illustrated below.
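To make the encryption idea a little more concrete, here is a minimal Python sketch of a shared-secret exchange in the spirit of that chapter. The tiny numbers (p = 23, g = 5, and the two private numbers) are assumptions purely for illustration; a real system would use enormous values.

```python
# Toy shared-secret exchange (Diffie-Hellman style) with made-up small numbers.
# NOT secure as written; real systems use very large primes.
p = 23          # public prime modulus
g = 5           # public base

alice_secret = 6    # Alice's private number (never sent)
bob_secret = 15     # Bob's private number (never sent)

# Each side publishes g raised to its secret, mod p.
alice_public = pow(g, alice_secret, p)
bob_public = pow(g, bob_secret, p)

# Each side combines its own secret with the other's public value.
alice_shared = pow(bob_public, alice_secret, p)
bob_shared = pow(alice_public, bob_secret, p)

assert alice_shared == bob_shared   # both arrive at the same shared secret
print("shared secret:", alice_shared)
```

The point is that both sides end up with the same secret number even though only the public values ever travel over the network.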
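Here is a rough Python sketch of that kind of race condition (not the original Scratch project): several threads each try to add to a shared counter, and because the read and the write can interleave, some updates get lost.

```python
import threading

counter = 0  # shared state with no lock protecting it

def increment_many(times):
    """Repeatedly read-modify-write the shared counter without a lock."""
    global counter
    for _ in range(times):
        value = counter       # read the current value
        counter = value + 1   # write it back; another thread may have updated in between

threads = [threading.Thread(target=increment_many, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# Expected 400000, but the result often comes out lower because
# interleaved reads and writes overwrite each other's updates.
print("counter =", counter)
```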
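And here is Amdahl's law written out as a small calculation. The 90% parallel fraction and the processor counts are just assumed numbers for illustration.

```python
def amdahl_speedup(parallel_fraction, processors):
    """Amdahl's law: overall speed-up when only part of a program can run in parallel."""
    serial_fraction = 1 - parallel_fraction
    return 1 / (serial_fraction + parallel_fraction / processors)

# If 90% of the work parallelizes across 8 processors, the speed-up is
# only about 4.7x, not 8x, because the serial 10% never gets faster.
print(amdahl_speedup(0.90, 8))

# Even with a huge number of processors, that serial 10% caps the speed-up near 10x.
print(amdahl_speedup(0.90, 10_000))
```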

Works Cited
"Theoretical Computer Science." Wikipedia. Wikimedia Foundation, n.d. Web. 14 Nov. 2016. <https://en.wikipedia.org/wiki/Theoretical_computer_science>.
Hillis, W. Daniel. The Pattern on the Stone: The Simple Ideas That Make Computers Work. New York: Basic, 1998. Print.
MacCormick, John. Nine Algorithms That Changed the Future: The Ingenious Ideas That Drive Today's Computers. Princeton: Princeton UP, 2012. Print.