Just a quick installation procedure for TensorFlow 2 (tf2).
If you’re like me and still on Ubuntu 16.04 with Python 3.5, you might have experienced that a simple pip install does not work properly:
pip install tensorflow==2.0.0-alpha0
ends up with a:
2.0.0-alpha0 not found
Before you start yelling at Google and crying at your computer, just relax and read what follows. Indeed, tf2 is available through pip only if you run Python 3.7, so if your Python version is under 3.7, you’re stuck…
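Before blaming pip, it is worth checking which interpreter you are actually running, since the PyPI wheels only cover specific Python versions. A minimal check:

```python
import sys

# Print the interpreter version as a (major, minor, micro) tuple,
# e.g. (3, 5, 2) on a stock Ubuntu 16.04 install.
print(sys.version_info[:3])
```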
But of course there is a very simple way to install tf2 alpha0.
- Go there: GPU, or there: no GPU
- Download the version corresponding to your OS (Linux of course) and your Python version.
- Now you have a wheel of tensorflow.
- In order to test it, I advise you to do so in a virtual environment. I personally use virtualenv:
virtualenv -p /usr/bin/python3.6 venv
- Install tensorflow:
pip install /home/mycomputer/Downloads/tensorflow_gpu-2.0.0-cp3.X-cp3.Xm-manylinux1_x86_64.whl
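To pick the right wheel, the cp3X tags in the filename have to match your interpreter. One quick way to see your interpreter’s ABI tag (assuming CPython):

```python
import sysconfig

# The SOABI string, e.g. 'cpython-36m-x86_64-linux-gnu', tells you
# which cp3X/cp3Xm wheel matches this interpreter.
print(sysconfig.get_config_var('SOABI'))
```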
Great! Now you’re all set to work on TensorFlow 2 in a virtualenv.
Don’t forget to get out of your environment once done:
deactivate
More on virtual environments.
More on tf2:
This article aims at giving an overview of what a neural network is in the context of computing. As a computer obviously contains no neurons, the goal is to demystify this concept. Since a lot of artificial intelligence (AI) and Machine Learning (ML) concepts are largely inspired by biology, we will first have a quick introduction to what a neuron is and how it works. Then we will define what perceptron neurons and sigmoid neurons are.
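The two neuron types mentioned above can be sketched in a few lines. This is a minimal illustration, not taken from the article itself: a perceptron outputs a hard 0/1 decision, while a sigmoid neuron squashes the same weighted sum into a smooth value between 0 and 1.

```python
import math

def perceptron(x, w, b):
    # Perceptron neuron: fires 1 if the weighted sum crosses 0, else 0.
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 if z > 0 else 0

def sigmoid_neuron(x, w, b):
    # Sigmoid neuron: same weighted sum, but a smooth output in (0, 1).
    z = sum(wi * xi for wi, xi in zip(w, x)) + b
    return 1 / (1 + math.exp(-z))

x = [1.0, 0.5]
w = [0.6, -0.4]
b = -0.1
print(perceptron(x, w, b))      # hard 0/1 decision
print(sigmoid_neuron(x, w, b))  # graded activation around 0.57
```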
I am referencing here the websites that I very often visit in my working (and geeking) journey. I imagine that you probably know most of them, but some others might be new to you. I do not reference here some “obvious” references such as stackoverflow (oops, just did it) or developer websites such as Apache, Tensorflow and so on.
Artificial intelligence (AI) is the ability of machines to reproduce human or animal capacities such as problem-solving. One of AI’s subdomains is Machine Learning (ML), whose goal is to make computers learn business rules without the business knowledge being coded in, simply by giving the computer the data. Among ML methods, Deep Learning (DL) is based on data representation. The philosophy behind this concept is to mimic the brain’s processing pattern in order to define the relationships between stimuli. This has led to a layered organisation of algorithms, particularly efficient in the computer vision field. In some cases, these algorithms are able to surpass human abilities.
In this article, I will try to demystify and explain in everyday words the concept of metadata and why it is so important in today’s data world.
When Audrey is asked why she is studying the blob, here is her answer:
“When E. Kandel decided to study aplysia, everybody laughed at him, and look where he is now! A Nobel prize winner who changed neuroscience. It is the same for several other studies that led to major discoveries. So wait 20 to 25 years and I’ll tell you if it was worth it!”