If you are interested in machine learning, and deep learning in particular, or are simply curious about the field, you have probably heard of TensorFlow. Let's take a look at what TensorFlow is and what it can do.
What is TensorFlow?
TensorFlow is an end-to-end open-source machine learning library developed by Google. Whether you are an expert in this field or a beginner, TensorFlow provides a platform where you can easily build and deploy machine learning models.
TensorFlow is developed with Python and currently supports several other programming languages, including Java, C++, JavaScript, and R.

Basic Applications of TensorFlow
- Language Detection
- Voice Search
- Text Detection
- Visual Recognition
- Video Detection
- Time Series
How to Install TensorFlow?
TensorFlow can be installed and run on both the CPU and the GPU. If you are wondering which to prefer: when your graphics card is supported, the GPU installation is the more advantageous choice, because operations run on the GPU are far more efficient and much faster than the same operations run on the CPU.
Here we will walk through the TensorFlow CPU installation together. The following installation method is based on the system requirements and supported Python versions published on the official TensorFlow website; it is assumed that a supported Python version is already installed on your computer.
TensorFlow CPU Installation
Installing TensorFlow for the CPU is a bit easier than installing it for the GPU, so let's start with the CPU installation.
Step 1: Open the command window of the operating system you use and run the command below. When it finishes, the TensorFlow CPU installation will be complete.

pip install --upgrade tensorflow
Step 2: Write the code below into the command line to check the installation. If a tensor is returned, the installation has been successful.
python -c "import tensorflow as tf;print(tf.reduce_sum(tf.random.normal([1000, 1000])))"
That is all it takes to install TensorFlow CPU successfully. If you want to do the GPU installation instead, you should first check whether your display card is supported at this address, and then review the GPU installation documents on the official TensorFlow website.
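Once the one-line check returns a tensor, you can try a couple of small operations yourself to confirm the installation behaves as expected. A minimal sketch (the values and variable names are illustrative, not from the official docs):

```python
import tensorflow as tf

# A small 2x2 tensor of constants.
a = tf.constant([[1.0, 2.0],
                 [3.0, 4.0]])

# Add up all four entries: 1 + 2 + 3 + 4 = 10.
total = tf.reduce_sum(a)

# Multiply the matrix by itself (standard 2x2 matrix multiplication).
product = tf.matmul(a, a)

print(total.numpy())          # 10.0
print(product.numpy().shape)  # (2, 2)
```

If both lines print without errors, TensorFlow is installed and executing operations on your CPU.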
How to Use TensorFlow?
TensorFlow is not a technology designed for a single field; it can be used in many areas. One reason it is so widely used is that there are different TensorFlow libraries for different fields, so you can use TensorFlow in projects you develop for mobile apps, web apps, or IoT devices.
To develop TensorFlow applications in Python, C++, Java, JavaScript, or R, it is enough to use the library for the language you want to work in.
TensorFlow JavaScript Support: TensorFlow.js
Speaking of JavaScript: TensorFlow.js is a JavaScript library developed by Google for training and using machine learning (ML) models in the browser. It accompanies TensorFlow, the popular machine learning library for Python.
For those who want to explore this field, Google has prepared demos with TensorFlow.js and made them available to users.
With the increasing use of machine learning and the popularity of JavaScript development, TensorFlow.js seems likely to grow even more popular in the near future.
TensorFlow-Lite
TensorFlow-Lite is a tool that allows models created with TensorFlow to work more efficiently on mobile devices, embedded systems, and IoT devices.
TensorFlow-Lite consists of two main units: TensorFlow-Lite Interpreter and TensorFlow-Lite Converter.
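The two units work together: the Converter shrinks a trained TensorFlow model into the compact `.tflite` format, and the Interpreter runs that converted model on the target device. A minimal sketch of the round trip, assuming a trivial one-layer Keras model (the model and the input shape are illustrative assumptions, not from the article):

```python
import numpy as np
import tensorflow as tf

# A trivial trained-from-scratch Keras model: 4 inputs -> 1 output.
model = tf.keras.Sequential([
    tf.keras.Input(shape=(4,)),
    tf.keras.layers.Dense(1),
])

# TensorFlow-Lite Converter: serialize the model to the compact .tflite format.
converter = tf.lite.TFLiteConverter.from_keras_model(model)
tflite_model = converter.convert()  # bytes, ready to ship to a device

# TensorFlow-Lite Interpreter: load and run the converted model,
# the same way a mobile or embedded device would.
interpreter = tf.lite.Interpreter(model_content=tflite_model)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]
out = interpreter.get_output_details()[0]

interpreter.set_tensor(inp["index"], np.ones((1, 4), dtype=np.float32))
interpreter.invoke()
result = interpreter.get_tensor(out["index"])
print(result.shape)  # one prediction for one input row
```

In a real project you would save `tflite_model` to a file and bundle it with your mobile or embedded app, where only the lightweight Interpreter needs to run.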

Advantages of TensorFlow-Lite
- You can easily develop ML applications for iOS and Android devices.
- It offers APIs for different programming languages.
- You can easily convert TensorFlow models into TensorFlow-Lite models optimized for mobile devices.
- It runs machine learning models quickly, with low latency, on mobile and embedded devices, so you can carry out machine learning on these devices without an external API or server. This means your model can work offline on the device.
Many examples have been made available with TensorFlow-Lite; you can examine them on the official TensorFlow site.