The TensorFlow AI framework
Machine Schooling
Second Attempt
TensorFlow comes from Google and is therefore backed by one of the world's most powerful corporations. However, it is not Google's first foray into artificial intelligence: DistBelief, created in 2011 by Google Brain (the company's deep learning research team), was a purely proprietary platform. Although it was used widely inside Google, it was never intended for external use, let alone release. Google then hired a number of scientists and programmers whose day-to-day business was AI and neural networks, and the old DistBelief code was revamped extensively.
Google published the result of those efforts as the open source TensorFlow project in 2015, and it enjoyed great popularity from the first moment. Many of today's AI use cases would be practically unthinkable without TensorFlow. Although Google can be accused of disregarding its "don't be evil" slogan in some departments, the company has undoubtedly done the world a service by publishing TensorFlow.
Written in Python
The task descriptions contributed by developers can be written in Python, which has proven helpful for the dissemination of TensorFlow. The description of a neural network is thus composed in a moderately complex scripting language, but in Python the user only notes down the tasks; under the hood, a C++-based engine processes the tasks the developer has defined. That said, the tensors and nodes are all Python objects that can be used like those of any other Python application.
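As a minimal sketch (assuming the current TensorFlow 2 Python API, with values invented purely for illustration), the following lines show how tensors behave like ordinary Python objects while the actual arithmetic runs in the C++ engine:

import tensorflow as tf

# Tensors are created and handled as ordinary Python objects.
a = tf.constant([[1.0, 2.0], [3.0, 4.0]])
b = tf.constant([[1.0, 1.0], [0.0, 1.0]])

# The matrix multiplication is written down in Python, but the
# computation itself is carried out by the C++ engine underneath.
c = tf.matmul(a, b)

print(c.numpy())  # the result converts back to a NumPy array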
The Python bindings are only an initial introduction to TensorFlow; if you want to use another language, you have a wide choice. TensorFlow is ultimately a framework with a defined API, for which different front ends can be written with relative ease. Developers can choose from Python, C, C++, Go, Java, JavaScript, and Swift, with further implementations from third-party vendors (e.g., C# and R). The results TensorFlow produces are independent of the front-end language used.
Wide Hardware Base
As you know, not every type of processor is suitable for every type of workload, a fact that was driven home recently when powerful graphics cards with Nvidia chipsets were unavailable for months or available only at absurd prices. At the time, cryptocurrency miners were buying up these boards by the truckload to mine digital currency. The relatively banal reason is that a GPU handles the mining calculations far faster than a regular processor, because the GPU is optimized for massively parallel workloads of that kind. For other workloads, though, a regular x86_64 CPU comes out clearly ahead.
TensorFlow again scores points with abstraction in its handling of different processor types: appropriate instructions let you run a given workload on a specific target processor. Moreover, TensorFlow can use multi-CPU environments, so you do not have to deal with the peculiarities of individual CPUs; you simply write instructions that TensorFlow then processes.
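A rough sketch of what this looks like in practice (device names such as "/CPU:0" or "/GPU:0" depend on the machine at hand, and the workload here is invented for illustration):

import tensorflow as tf

# Show which processors TensorFlow has discovered on this machine.
print(tf.config.list_physical_devices())

# Pin a computation to a specific device; "/GPU:0" works only if a
# supported graphics card is actually present.
with tf.device("/CPU:0"):
    x = tf.random.uniform((1000, 1000))
    y = tf.matmul(x, x)

print(y.device)  # reports where the result was computed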