Quick Installation Guide
- Step 1: Install Docker
https://www.docker-cn.com/community-edition#/download
Then configure Docker to use the official Chinese registry mirror.
- Step 2: Set up TensorFlow environment
$ docker run -it -p 8888:8888 tensorflow/tensorflow
Running this command automatically downloads the TensorFlow image; the download is only reasonably fast if the Docker registry mirror has been set to a Chinese mirror, otherwise it will be very slow. Once the container starts, the terminal prints a URL. Open that URL in a browser to reach the TensorFlow Jupyter editing environment, where all of the code in this guide will be entered.
- Mounting a local directory into the container
If we need to access local files, we must mount a local folder to a directory inside the container. Stop the container, then start it again with the
-v host_directory:container_directory
option for mounting:
$ docker run -v /Users/hahaha/tensorflow/:/notebooks -it -p 8888:8888 tensorflow/tensorflow
Here /Users/hahaha/tensorflow/ is a folder on my Mac, and /notebooks is the default Jupyter working directory inside the TensorFlow container.
Running Hello World Code
Create a new Python 2 notebook in Jupyter, enter the following code, and click the run button. A “Hello, TensorFlow!” string should appear below the cell, indicating that the program ran successfully.
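A minimal sketch of the hello-world cell (the exact code is assumed; it follows the standard TensorFlow 1.x pattern and matches the tf.constant() and tf.Session() calls discussed below):

```python
import tensorflow as tf

# Define a constant op; this only adds a node to the default graph.
hello = tf.constant('Hello, TensorFlow!')

# Launch the graph in a session and run the op to fetch its value.
sess = tf.Session()
print(sess.run(hello))  # prints the "Hello, TensorFlow!" string
sess.close()
```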
Program Explanation
From this simple code, we can see that TensorFlow is very easy to use. It’s imported as a standard Python library without requiring additional services to be started. For those new to TensorFlow, you might wonder why we need to use tf.constant() and tf.Session() to output a “Hello World” string when Python itself could do it. The reason is that TensorFlow defines and runs models and training through Graphs and Sessions, which provides significant benefits for complex models and distributed training.
First, in TensorFlow there are two core concepts: Graph and Operation. An Operation represents a unit of computation to be performed, and a Graph contains many Operations. A Session is used to execute the Operations in a Graph.
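To make these concepts concrete, here is a small sketch (assuming the same TensorFlow 1.x API as above) that adds a couple of Operations to the default Graph and then executes them in a Session:

```python
import tensorflow as tf

# Graph construction: these calls add Operations to the default Graph,
# but nothing is computed yet.
a = tf.constant(3.0)
b = tf.constant(4.0)
total = tf.add(a, b)  # an Operation whose output will be a + b

# Graph execution: the Session runs the requested Operation and returns its value.
with tf.Session() as sess:
    print(sess.run(total))  # 7.0
```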
Basic Usage
When using TensorFlow, you must understand that TensorFlow:
- uses a graph to represent computational tasks;
- executes the graph in a context called a Session;
- represents data using tensors;
- maintains state through Variables;
- uses feed and fetch to assign values to, or retrieve data from, arbitrary operations.
Each of these pieces appears in the sketch after this list.
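A short sketch (again assuming the TensorFlow 1.x API used elsewhere in this guide) that exercises each piece: ops in a graph, a Session, a Variable holding state, and feed/fetch:

```python
import tensorflow as tf

# A Variable maintains state across session runs; a placeholder receives
# its value through feed at run time.
state = tf.Variable(0, name='counter')
increment = tf.placeholder(tf.int32)
update = tf.assign(state, state + increment)

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # Variables must be initialized
    # feed: supply a value for the placeholder; fetch: retrieve the op's result.
    print(sess.run(update, feed_dict={increment: 5}))   # 5
    print(sess.run(update, feed_dict={increment: 10}))  # 15
```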
Overview
TensorFlow is a programming system that uses graphs to represent computational tasks. Nodes in the graph are called ops (short for operations). An op takes 0 or more Tensors, performs computations, and produces 0 or more Tensors. Each Tensor is a typed multi-dimensional array. For example, you can represent a small batch of images as a four-dimensional floating-point array with dimensions [batch, height, width, channels].
A TensorFlow graph describes the computation process. To perform computation, the graph must be launched in a session. The session distributes the graph’s ops to devices such as CPUs or GPUs and provides methods to execute them; after execution, these methods return the resulting tensors. In Python, the returned tensors are numpy ndarray objects; in C and C++, they are tensorflow::Tensor instances.
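For instance, a minimal sketch (assuming the Python API described above) that runs an op on a small [batch, height, width, channels] tensor and gets back a numpy ndarray:

```python
import tensorflow as tf

# A 4-D tensor shaped [batch, height, width, channels], as in the image example above.
images = tf.zeros([2, 28, 28, 3], dtype=tf.float32)
doubled = images * 2.0  # an op that produces another 4-D tensor

with tf.Session() as sess:
    result = sess.run(doubled)  # the session executes the op on a CPU or GPU
    print(type(result))   # numpy.ndarray
    print(result.shape)   # (2, 28, 28, 3)
```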