git — To clone the GitHub repository that contains a Streamlit app ready to be deployed.
Docker — To bundle the application code, requirements, and configuration together into a Docker image.
Google Cloud SDK — To deploy the app onto GCP. (You don’t need this if you are already working on a virtual machine on Google Cloud.) Follow the directions on the install page: after downloading and unzipping the files, run the install file using…
In this post we will write a Python script that fetches stock market data using the yfinance package, processes the data, and uploads it into a Google BigQuery table, where it can be used for further analysis or visualization.
To fetch the data for a particular stock, we need the stock’s ticker symbol. Once you have the ticker symbol, you can either use yf.Ticker() to fetch historical stock market data for a single stock, or yf.download() to fetch data for multiple tickers at the same time. Let’s fetch stock market data for stocks in the S&P 500. …
What is BigQuery?
- A cloud-based Data Warehouse
- with a distributed SQL query engine that can process terabytes of data in seconds.
The traditional way to work with a data warehouse is to start with an ETL job: Extract the raw data from the source, Transform the data, and Load it into the warehouse. The ETL pipeline that loads data into BigQuery is typically written in Apache Beam or Apache Spark, which extracts the raw data (either streaming data or batch files), transforms it (performing cleanup and/or aggregations), and then loads it into BigQuery.
We all know how frustrating job hunting can be: the numerous hours spent filling out applications, the countless rejections and, worse yet, recruiters who don’t even acknowledge your existence. It can be pretty overwhelming, and at times you just feel like giving up everything and moving to the wilderness of Montana. Or maybe that’s just me!
So, what can you do to stand out from other candidates and have an edge over them? An exceptional skill set and stellar experience definitely help, but beyond that, creating a personal website to showcase the best of your abilities might…
Learn more about Google’s serverless, highly scalable and lightning-fast Data Warehouse here: BigQuery.
In this article we will be looking at how to access and query data stored in BigQuery using R and Python.
We will be using the bigrquery package created by Hadley Wickham, which provides a very simple and easy-to-use interface to Google’s BigQuery API. Go ahead and install the package if you haven’t already.
# Install and load packages
Before you can start querying, you need to authorize bigrquery so that it can access your BigQuery projects. The easiest way to do…
pip install cx_Oracle
Now that you have all the tools required, open up Python…
I recently completed a course on Coursera by Amazon Web Services called AWS Fundamentals: Going Cloud-Native. Though I have worked with EC2 instances before, it was a very good introduction to all the other services available on AWS, including their database, storage, and networking services. I now also have a better understanding of how the AWS platform works as a whole and the basic building blocks behind it.
The notes below cover the entire course, and I tried to capture everything essential as succinctly as possible. …
In this post we will look at how to install ChromeDriver on your AWS EC2 instance and get Selenium up and running.
Check out my previous posts.
sudo mv chromedriver /usr/bin/chromedriver
chromedriver --version
For ChromeDriver to work we also need to install Google Chrome. After you run the first curl command in the code block below, it might take a little while to download and install the necessary files.
So you have an EC2 instance up and running on AWS. (If you don’t have one already, take a look at this post: https://email@example.com/launching-and-connecting-to-an-aws-ec2-instance-6678f660bbe6.)
Now let’s see how we can set up a Python environment, transfer Python scripts from your local machine to the remote instance, and run them.
To see everything that’s already installed on your instance, you can run yum list installed. To install Python 3.6, type in the following command:
sudo yum install python36
Even after installing Python 3.6, running python --version in the Putty…