Purpose: This session shows how to build a web-based tool that lets users select a classification or clustering approach for the Iris dataset. This version includes SVM, K-Means, and Hierarchical Clustering; more approaches will be added in the future.
Dataset Introduction: Link
User Interface:
…
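To make the tool's backend concrete, here is a minimal sketch of how each approach could be run on the Iris data, assuming scikit-learn is available. The `run_approach` helper and the approach names are illustrative, not the session's actual code.

```python
# Hypothetical backend helper: map the user's UI selection to a model run.
# Assumes scikit-learn; names and return values are illustrative only.
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.cluster import KMeans, AgglomerativeClustering

def run_approach(name):
    X, y = load_iris(return_X_y=True)
    if name == "SVM":
        # Supervised: report hold-out accuracy.
        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
        return SVC().fit(X_tr, y_tr).score(X_te, y_te)
    if name == "K-Means":
        # Unsupervised: return the cluster label for each sample.
        return KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(X)
    if name == "Hierarchical":
        return AgglomerativeClustering(n_clusters=3).fit_predict(X)
    raise ValueError(f"unknown approach: {name}")
```

The web layer would simply call `run_approach` with the option the user picked and render the result.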
We have had sessions on how to use Python to drive a Macro in Access (Link) and how to set up Task Scheduler (Link). Today we will show how to send an email notification (with an attachment) once your scheduled job completes.
Create a .py file (send_email.py) containing the scripts that handle sending email, so it can be reused by multiple reports.
1. Define functions for starting and ending the SMTP server session. In Start_SMTP_Server, a user ID and passcode are required to log in. …
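A minimal sketch of what send_email.py's start/end functions and a message builder could look like, using only the stdlib `smtplib` and `email` modules. The host, port, and `Build_Message` helper are assumptions for illustration, not the session's exact code.

```python
# Sketch of send_email.py; server details and helper names are placeholders.
import os
import smtplib
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText
from email.mime.application import MIMEApplication

def Start_SMTP_Server(host, port, user_id, passcode):
    """Open an SMTP session and log in; user ID and passcode are required."""
    server = smtplib.SMTP(host, port)
    server.starttls()
    server.login(user_id, passcode)
    return server

def End_SMTP_Server(server):
    """Close the SMTP session."""
    server.quit()

def Build_Message(sender, recipients, subject, body, attachment_path=None):
    """Assemble the notification email, optionally with one attachment."""
    msg = MIMEMultipart()
    msg["From"] = sender
    msg["To"] = ", ".join(recipients)
    msg["Subject"] = subject
    msg.attach(MIMEText(body, "plain"))
    if attachment_path:
        with open(attachment_path, "rb") as f:
            part = MIMEApplication(f.read())
        part.add_header("Content-Disposition", "attachment",
                        filename=os.path.basename(attachment_path))
        msg.attach(part)
    return msg
```

A report script would call `Start_SMTP_Server`, `server.send_message(Build_Message(...))`, then `End_SMTP_Server` when the scheduled job finishes.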
Purpose
We search for and grab laptop information from the Amazon website by rendering pages in a browser via Selenium, so that we can easily play around with the data. We originally tried to leverage Scrapy, but it did not work for this site.
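As a rough sketch of the Selenium route (assuming Selenium 4+ with a Chrome driver installed): the search URL pattern is Amazon's public keyword-query format, while the CSS selector and helper names below are illustrative assumptions.

```python
# Sketch only: the selector "h2 a span" is a guess at where result titles
# live and must be verified in the browser's dev tools.
from urllib.parse import quote_plus

def build_search_url(keyword):
    """Amazon keyword-search URL for a query like 'laptop'."""
    return "https://www.amazon.com/s?k=" + quote_plus(keyword)

def fetch_titles(keyword, max_items=10):
    # Imported here so build_search_url works even without Selenium installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    driver = webdriver.Chrome()
    try:
        driver.get(build_search_url(keyword))
        cells = driver.find_elements(By.CSS_SELECTOR, "h2 a span")
        return [c.text for c in cells[:max_items]]
    finally:
        driver.quit()
```

Because Selenium drives a real browser, JavaScript-rendered content is available to `find_elements`, which is the advantage over plain HTTP scraping here.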
Tool
Target Website
Purpose
Search for and grab laptop information from the Staples website via Scrapy, and store the data in SQLite so that we can easily play around with it.
Tool
Target Website
Steps
2. Go through the structure of the website to locate where the target elements are.
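Once the browser's dev tools show where the target elements live, extraction reduces to matching those elements. In the session this is done with Scrapy's CSS/XPath selectors; the stdlib sketch below shows the same idea so it runs standalone, and the class names (`title`, `price`) are hypothetical.

```python
# Stdlib illustration of step 2: pull the product name and price out of a
# listing block. The sample HTML and class names are made up for the demo.
from html.parser import HTMLParser

SAMPLE = """
<div class="product">
  <a class="title">Brand X 15.6 inch Laptop</a>
  <span class="price">$499.99</span>
</div>
"""

class ProductParser(HTMLParser):
    def __init__(self):
        super().__init__()
        self.items, self._field = [], None

    def handle_starttag(self, tag, attrs):
        cls = dict(attrs).get("class", "")
        if cls in ("title", "price"):
            self._field = cls          # remember which field we just entered

    def handle_data(self, data):
        if self._field:
            self.items.append((self._field, data.strip()))
            self._field = None

parser = ProductParser()
parser.feed(SAMPLE)
```

In Scrapy the equivalent would be one-liners like `response.css(".title::text").get()`, but the element-locating work in dev tools is identical.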
SQLite is a C-language library that implements a small, fast, self-contained, high-reliability, full-featured SQL database engine; it is the most used database engine in the world. We are going to discuss how to leverage Python to load data from a JSON file, MS Excel, and MS Access into SQLite.
JSON file into SQLite
import…
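The elided script above could look like the following stdlib-only sketch; the file name, table name, and assumption that the JSON holds a list of flat dicts are all illustrative.

```python
# Hedged sketch of loading a JSON file into SQLite with the stdlib only.
import json
import sqlite3

def load_json_to_sqlite(json_path, db_path, table="laptops"):
    with open(json_path) as f:
        records = json.load(f)          # expects a list of flat dicts
    cols = list(records[0])
    conn = sqlite3.connect(db_path)
    # Create the table from the first record's keys (all TEXT for the demo).
    conn.execute(f"CREATE TABLE IF NOT EXISTS {table} "
                 f"({', '.join(c + ' TEXT' for c in cols)})")
    conn.executemany(
        f"INSERT INTO {table} VALUES ({', '.join('?' for _ in cols)})",
        [tuple(r[c] for c in cols) for r in records])
    conn.commit()
    return conn
```

Excel and Access follow the same pattern: read the rows with the appropriate reader, then `executemany` them into SQLite.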
Purpose
Search for and grab laptop information from the Newegg website via Scrapy, and store the data in SQLite so that we can easily play around with it.
See the full version LINK.
Tool
Target Website
Steps
2. Go through the structure of the website to locate where the target elements are.
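The "store into SQLite" half of this workflow is typically a Scrapy item pipeline. A minimal sketch follows; the table layout and field names are assumptions, and in a real project the class would be registered in the `ITEM_PIPELINES` setting. It is plain Python, so it also runs on its own.

```python
# Sketch of a Scrapy item pipeline writing scraped laptops into SQLite.
# Field names "name"/"price" are assumptions for the demo.
import sqlite3

class SQLitePipeline:
    def __init__(self, db_path="newegg.db"):
        self.db_path = db_path

    def open_spider(self, spider=None):
        # Called once when the spider starts: open the DB and ensure the table.
        self.conn = sqlite3.connect(self.db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS laptops (name TEXT, price TEXT)")

    def process_item(self, item, spider=None):
        # Called for every scraped item.
        self.conn.execute("INSERT INTO laptops VALUES (?, ?)",
                          (item["name"], item["price"]))
        return item

    def close_spider(self, spider=None):
        self.conn.commit()
        self.conn.close()
```

Scrapy calls `open_spider`, `process_item`, and `close_spider` at the matching points of the crawl, so the spider itself only has to yield item dicts.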
Purpose
Search for and grab laptop information from the B&H website via Scrapy, and store the data in SQLite so that we can easily play around with it. See the full version LINK.
Tool
Target Website
Steps
Purpose: You may have a report published on a regular basis that requires lots of formatting work. This session shows how to leverage VBA to automate the formatting and save you a lot of time.
Input: I use a product class-level profitability report as the example.
Here is the structure of raw data:
1. The data is stored in the worksheet “Raw_Data”.
2. There are 10 columns, including business-unit info (Division, Department), class-hierarchy info, and sales, COGS, and class-level margin figures.
3. This is just sample data. …