# How to Represent Data for Neural Networks

Introduction

All modern machine learning systems use tensors (multidimensional arrays) as their main data structure; data is stored in the form of tensors. At its core, a tensor is a container for data, and that data is almost always numerical. In other words, a tensor is a container for numbers. For neural networks, data is generally represented in the following formats:

Scalars (0D tensors): A tensor that contains only one number is called a scalar (0-dimensional tensor). In Numpy, a float64 or float32 number is a scalar tensor.

Vectors (1D tensors): An array of numbers is called a vector, or 1D tensor. A 1D tensor has exactly one axis.
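The two definitions above can be checked directly in Numpy (a minimal sketch; the variable names are just for illustration):

```python
import numpy as np

s = np.array(12)               # scalar (0D tensor): one number, zero axes
v = np.array([1, 2, 3, 4, 5])  # vector (1D tensor): exactly one axis

print(s.ndim)  # 0
print(v.ndim)  # 1
```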

For example, x = np.array([1, 2, 3, 4, 5, 6, 7, 8]) has 8 entries and is therefore called an 8-dimensional vector. Note that an 8D vector and an 8D tensor are different things: an 8D vector has only one axis, with eight entries along it, whereas an 8D tensor has eight axes and may have any number of dimensions along each axis. Dimensionality can denote either the number of entries along a specific axis (as in the case of our 8D vector) or the number of axes in a tensor (as in an 8D tensor).
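The distinction can be made concrete with ndim and shape (a sketch; the 8-axis tensor here is an assumed example built with np.zeros):

```python
import numpy as np

v8 = np.array([1, 2, 3, 4, 5, 6, 7, 8])  # 8D *vector*: one axis, 8 entries
t8 = np.zeros((1,) * 8)                   # 8D *tensor*: eight axes

print(v8.ndim, v8.shape)  # 1 (8,)
print(t8.ndim, t8.shape)  # 8 (1, 1, 1, 1, 1, 1, 1, 1)
```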

Vectors (1D tensors).

A vector is an array of numbers. It is also called a 1D tensor. A 1D tensor has exactly one axis.

The following is a Numpy vector:

>>> x = np.array([12, 3, 6, 14])
>>> x
array([12, 3, 6, 14])
>>> x.ndim
1

Matrices (2D tensors).

A matrix is an array of vectors. It is also called a 2D tensor. Often described in terms of rows and columns, a matrix has two axes. We can visually interpret a matrix as a rectangular grid of numbers.

This is a Numpy matrix:

>>> x = np.array([[5, 78, 2, 34, 0],
...               [6, 79, 3, 35, 1],
...               [7, 80, 4, 36, 2]])
>>> x.ndim
2

The entries from the first axis are called the rows, and the entries from the second axis are called the columns. In the example above, [5, 78, 2, 34, 0] is the first row of x, and [5, 6, 7] is the first column.

By the same logic, a 5D tensor has five axes and can have any number of dimensions along each axis. Dimensionality may denote either the number of entries along a particular axis or the number of axes in a tensor. The rank of a tensor is the number of axes.

3D and higher-dimensional tensors.

If we pack such matrices in a new array, we obtain a 3D tensor, which we can visually interpret as a cube of numbers.

The following is a Numpy 3D tensor:

>>> x = np.array([[[5, 78, 2, 34, 0],
...                [6, 79, 3, 35, 1],
...                [7, 80, 4, 36, 2]],
...               [[5, 78, 2, 34, 0],
...                [6, 79, 3, 35, 1],
...                [7, 80, 4, 36, 2]],
...               [[5, 78, 2, 34, 0],
...                [6, 79, 3, 35, 1],
...                [7, 80, 4, 36, 2]]])
>>> x.ndim
3

We can create a 4D tensor by packing 3D tensors in an array, and so on. In deep learning, we'll typically manipulate tensors that are 0D to 4D.
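The "packing" idea can be sketched with np.stack, which stacks arrays along a new leading axis (the matrix values are reused from the examples above):

```python
import numpy as np

m = np.array([[5, 78, 2, 34, 0],
              [6, 79, 3, 35, 1],
              [7, 80, 4, 36, 2]])

t3 = np.stack([m, m, m])   # pack matrices   -> 3D tensor
t4 = np.stack([t3, t3])    # pack 3D tensors -> 4D tensor

print(t3.ndim, t3.shape)   # 3 (3, 3, 5)
print(t4.ndim, t4.shape)   # 4 (2, 3, 3, 5)
```

Each call to np.stack adds exactly one new axis, which is why the rank goes up by one at every step.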

What are the key attributes of a tensor?

We can define a tensor by three key attributes:

Number of axes (rank): A 3D tensor has three axes, and a matrix has two axes. This is also called the tensor's ndim in Python libraries such as Numpy.

Shape: This is a tuple of integers. It describes how many dimensions the tensor has along each axis. For instance, the previous matrix example has shape (3, 5), and the 3D tensor example has shape (3, 3, 5). A scalar has an empty shape, (), and a vector has a shape with a single element, such as (5,).

Data type: A tensor's type might be float32, for example. Tensors live in pre-allocated, contiguous memory segments; for that reason, string tensors don't exist in Numpy or in most other libraries.
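All three attributes can be read off any Numpy array (a minimal sketch using the matrix from the earlier example):

```python
import numpy as np

x = np.array([[5, 78, 2, 34, 0],
              [6, 79, 3, 35, 1],
              [7, 80, 4, 36, 2]], dtype=np.float32)

print(x.ndim)   # number of axes (rank): 2
print(x.shape)  # entries along each axis: (3, 5)
print(x.dtype)  # data type: float32
```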

Graph Neural Networks.

A Graph Neural Network, as the name suggests, is a neural network that can be applied directly to graphs. It provides a convenient way to perform node-level, edge-level, and graph-level prediction tasks. There are mainly three types of graph neural networks:


Recurrent Graph Neural Network.
Spatial Convolutional Network.
Spectral Convolutional Network.
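The variants listed above all build on a message-passing step in which each node aggregates its neighbours' features and transforms them with a learned weight matrix. A hypothetical sketch of one such step in plain Numpy (the graph, features, and weights are all assumed toy values, not from any library):

```python
import numpy as np

rng = np.random.default_rng(0)

A = np.array([[0, 1, 0],      # adjacency matrix of a 3-node path graph
              [1, 0, 1],
              [0, 1, 0]], dtype=np.float32)
X = rng.normal(size=(3, 4)).astype(np.float32)  # node features (assumed)
W = rng.normal(size=(4, 2)).astype(np.float32)  # learned weights (assumed)

A_hat = A + np.eye(3, dtype=np.float32)   # add self-loops
D_inv = np.diag(1.0 / A_hat.sum(axis=1))  # mean-aggregation normalization
H = np.maximum(D_inv @ A_hat @ X @ W, 0)  # aggregate, transform, ReLU

print(H.shape)  # (3, 2): a new 2-feature embedding per node
```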