This blog is intended simply to show how to use a webcam and run pre-trained models on a video stream for 'out-of-the-box' inference.
Installation list:
The hyperlinks have detailed instructions for the installations. However, I would like to share some additional commands I had to run on Windows 10 while following the documentation, hoping they help when you work through the installations yourself.
A. When in COCO API Installation section:
The command ‘pip install git+https://github.com/philferriere/cocoapi.git#subdirectory=PythonAPI’ didn’t work as…
This is an attempt to keep all the must-know CNN and related vocabulary under one roof. We will also make use of a few visualization demos mentioned here.
Terms to discuss:
The convolution operation (also known as cross-correlation) is a simple mathematical operation between input channels and filters. The resulting output channel is called a feature map. An input channel holds the pixel values of an image: if the image is RGB, there are 3 channels; if the image is grayscale, just one channel is present.
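The sliding-window arithmetic described above can be sketched in plain NumPy. This is a minimal valid-padding, stride-1 version for illustration, not how frameworks actually implement it:

```python
import numpy as np

def cross_correlate2d(channel, kernel):
    """Slide `kernel` over `channel` (no flip, i.e. cross-correlation)
    and return the feature map (valid padding, stride 1)."""
    kh, kw = kernel.shape
    oh = channel.shape[0] - kh + 1
    ow = channel.shape[1] - kw + 1
    out = np.zeros((oh, ow))
    for i in range(oh):
        for j in range(ow):
            # Elementwise multiply the window by the filter, then sum.
            out[i, j] = np.sum(channel[i:i + kh, j:j + kw] * kernel)
    return out

# A 4x4 grayscale "image" (one channel) and a 2x2 filter.
image = np.array([[1, 2, 3, 4],
                  [5, 6, 7, 8],
                  [9, 10, 11, 12],
                  [13, 14, 15, 16]], dtype=float)
kernel = np.array([[1, 0],
                   [0, -1]], dtype=float)

feature_map = cross_correlate2d(image, kernel)
print(feature_map.shape)  # (3, 3) — a 4x4 input with a 2x2 filter gives 3x3
```

With three input channels (RGB), a filter has one such kernel per channel and the per-channel results are summed into a single feature map.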
Here is a simple guideline to install PyTorch on Windows 10. It's equally easy to install on Linux and Mac.
I recommend having a conda environment. If you want to install a specific python version, check this page.
Name of the environment here: pytorch
conda create --name pytorch
This blog is intended just to trigger interest in you to learn Deep Learning.
Use the keywords you find here to dig deeper.
What maps the inputs of a node to its output? The activation function. What makes activation functions important is that they are typically non-linear.
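Two common non-linear activation functions, ReLU and sigmoid, can be written in a few lines of NumPy:

```python
import numpy as np

def relu(x):
    """Rectified Linear Unit: zero for negative inputs, identity otherwise."""
    return np.maximum(0, x)

def sigmoid(x):
    """Squashes any real input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

z = np.array([-2.0, 0.0, 3.0])  # a node's weighted inputs
print(relu(z))       # [0. 0. 3.]
print(sigmoid(0.0))  # 0.5
```

Without such non-linearities, stacking layers would collapse into a single linear transformation, which is why they matter.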
Training a model means finding the weights that minimize the loss, as measured by a loss function. Loss is calculated at the end of each epoch to gauge how far the model's output is from the desired output. Hence the objective is to minimize the loss as much as possible using…
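As one concrete example, here is mean squared error, just one of many loss functions:

```python
import numpy as np

def mse_loss(y_true, y_pred):
    """Mean squared error: the average squared gap between the
    desired outputs and the model's outputs."""
    return np.mean((np.asarray(y_true) - np.asarray(y_pred)) ** 2)

# The closer the predictions are to the targets, the smaller the loss.
print(mse_loss([1.0, 2.0, 3.0], [1.0, 2.0, 3.0]))  # 0.0 (perfect match)
print(mse_loss([1.0, 2.0, 3.0], [2.0, 2.0, 3.0]))  # one prediction is off by 1
```

Gradient-based optimizers then nudge the weights in the direction that reduces this number.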
CuPy is a GPU array backend that implements a subset of the NumPy interface.
Not every NumPy function has a CuPy equivalent; check out the list here. However, the most frequently used NumPy functions do.
If you already have CUDA toolkit installed, check out these commands.
For installing CUDA toolkit, here is one blog which talks about CUDA toolkit installation.
NumPy runs on the CPU, which limits its speed. In the Colab notebook, you can see the difference in time required for the same operations in CuPy and NumPy.
To get started with CuPy, the following operations are timed to show both the syntax and the speed-up:
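A minimal sketch of how drop-in the CuPy API is. The try/except fallback is my own addition so the snippet also runs on CPU-only machines; on a GPU machine with CuPy installed, the same code runs on the GPU:

```python
import time

# CuPy mirrors the NumPy API, so `xp` can point at either library.
try:
    import cupy as xp       # GPU, if available
except ImportError:
    import numpy as xp      # CPU fallback

a = xp.random.rand(1000, 1000)
b = xp.random.rand(1000, 1000)

start = time.time()
c = a @ b                   # identical syntax under NumPy and CuPy
elapsed = time.time() - start
print(xp.__name__, "matmul took", elapsed, "seconds")
```

Note that fair GPU timings need a synchronization before reading the clock, since CuPy kernel launches are asynchronous.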
Here is my experience of getting TensorFlow to recognize the GPU on Windows 10.
It's not always smooth, but a few hints from this blog could help you get there quicker.
Hardware: GeForce RTX 2060 with Max-Q Design (All you need is a CUDA enabled card)
Before beginning any installations, check out the compatibility table here for your GPU. The combination that worked for me is: conda install tensorflow-gpu==2.1.0 with the CUDA 10.1 toolkit and the corresponding cuDNN.
Sign up on Nvidia website.
You can host Flask apps for free on Heroku.
Check this insta-analytics web-app.
In the previous blog, we learnt to host this app locally. It’s time to publish it on Heroku to make it globally accessible.
Some of the prerequisites are: a Git repository and a Heroku account.
The new files that would be needed before publishing it on Heroku:
The files need to be organized as shown here:
Once you create…
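For reference, the typical minimal set of new files looks something like this. This is a sketch that assumes the Flask entry point is a file named app.py exposing a Flask object named app, with gunicorn as the production server; your names may differ:

```
Procfile          (tells Heroku how to start the web process)
    web: gunicorn app:app

requirements.txt  (the dependencies Heroku must install)
    Flask
    gunicorn
```

Heroku reads the Procfile at deploy time and installs everything listed in requirements.txt before starting the app.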
The RAPIDS suite of software libraries, built on CUDA-X AI, gives you the freedom to execute end-to-end data science and analytics pipelines entirely on GPUs.
Installing RAPIDS can be a headache, as it requires specific versions of several libraries, and installing them individually is laborious. So here is code that does everything for you.
It is easiest to run the following cells in Google Colab.
nvidia-smi checks what GPU hardware is available.
!nvidia-smi
Thu Jan 14 06:49:11 2021
| NVIDIA-SMI 460.27.04 Driver Version: 418.67…
Downloading your own media for further sharing, analyzing, or migrating to other accounts or social media platforms are a few reasons you might want to back up your Instagram media. Instagram scraping in Python is getting easier every passing day.
This post shows you how to download your Instagram media to your system using Instaloader. Install Instaloader as suggested here.
You can sort the posts by highest likes and control the number of posts to be downloaded by simply tuning a single parameter, ‘X_percentage’.
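A hypothetical sketch of that selection logic on plain data; the real experiment.py pulls the posts and their like counts via Instaloader, and the function and variable names here are illustrative:

```python
def top_posts(posts_with_likes, X_percentage):
    """Keep only the top X_percentage of posts by like count.

    posts_with_likes: list of (post_id, likes) tuples.
    """
    # Sort by likes, highest first.
    ranked = sorted(posts_with_likes, key=lambda p: p[1], reverse=True)
    # Keep the requested fraction, but always at least one post.
    count = max(1, int(len(ranked) * X_percentage / 100))
    return ranked[:count]

posts = [("p1", 120), ("p2", 45), ("p3", 300), ("p4", 80)]
print(top_posts(posts, 50))  # top half by likes: [('p3', 300), ('p1', 120)]
```

Setting X_percentage to 100 downloads everything; smaller values keep only the most-liked posts.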
Refer to the GitHub link for the file experiment.py.
A folder with the same name as the username will be created in the directory of the experiment.py file.
If you are only interested in the posts_count & followers_count, take a look at the previous blog.
Writers- Piyush Kulkarni, Vaani Bansal.