Purpose: This session shows you how to build a web-based tool that lets users select classification approaches for the Iris dataset. This version includes SVM, K-means, and Hierarchical Clustering, and more will be added in the future.

Dataset Introduction: Link

User Interface:

  1. The default dropdown setting is “SVM”, which shows how the model classifies each item based on petal length and petal width. The table below shows the accuracy of the classification: if the Real column does not equal the Predict column, the model misclassified that item. A minimal sketch of this computation follows.
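The post's actual code is not shown here, so below is a rough sketch with scikit-learn of what the default view computes; the train/test split and kernel are assumptions, not the post's settings.

# Minimal sketch of the default “SVM” view: classify Iris items by petal
# length/width, then compare Real vs. Predict.
# The train/test split and kernel are assumptions, not the post's settings.
from sklearn.datasets import load_iris
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC

iris = load_iris()
X = iris.data[:, 2:4]        # petal length and petal width only
y = iris.target

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, random_state=42)

model = SVC(kernel="rbf").fit(X_train, y_train)
pred = model.predict(X_test)

print("Accuracy:", accuracy_score(y_test, pred))
# Rows where Real != Predict are the misclassified items in the table.
for real, predicted in zip(y_test, pred):
    if real != predicted:
        print("Real:", iris.target_names[real],
              "Predict:", iris.target_names[predicted])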


We previously had sessions on how to run Python to drive a macro in Access (Link) and how to set up Task Scheduler (Link). Today we will show how to send an email notification (with an attachment) once your scheduled job is complete.

Create a .py file (send_email.py) containing the scripts that handle sending email, which can be reused by multiple reports.

1. Define functions to start and end the SMTP server connection. In the Start_SMTP_Server function, a user ID and passcode are required to log in (sketched below). …
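The implementation is elided above, so here is a minimal sketch of what Start_SMTP_Server and a matching shutdown helper could look like with Python's built-in smtplib; the server host, port, and the second function's name are assumptions.

# Sketch of the SMTP start/end helpers described above.
# The host/port and the End_SMTP_Server name are assumptions.
import smtplib

def Start_SMTP_Server(user_id, passcode):
    # Connect to the SMTP server and log in with the given credentials.
    server = smtplib.SMTP("smtp.example.com", 587)
    server.starttls()              # upgrade to an encrypted connection
    server.login(user_id, passcode)
    return server

def End_SMTP_Server(server):
    # Close the SMTP connection once all notifications are sent.
    server.quit()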


In this piece, Josh uses easy-to-understand examples to introduce one method for evaluating how well different models perform: ROC & AUC.

First, Josh explains with data on mouse weight and whether each mouse is obese. Taking the figure below as an example, the blue points are mice labeled as obese, while the red points are mice that are not obese. The horizontal axis is each mouse's weight.

A logistic regression model is then built on these data to predict, from a mouse's weight, the probability that it is obese.
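To make this concrete, here is a small sketch of fitting such a model and computing ROC and AUC with scikit-learn; the weights and labels below are made-up sample data, not Josh's.

# Sketch: logistic regression on mouse weight vs. obesity, then ROC & AUC.
# The weights and labels are made-up illustrative data.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_curve, roc_auc_score

weights = np.array([[18], [20], [22], [25], [27], [30], [32], [35]])
obese = np.array([0, 0, 0, 1, 0, 1, 1, 1])   # 1 = obese, 0 = not obese

model = LogisticRegression().fit(weights, obese)
probs = model.predict_proba(weights)[:, 1]   # P(obese) for each mouse

# Each point on the ROC curve is one classification threshold:
# true-positive rate vs. false-positive rate; AUC summarizes the curve.
fpr, tpr, thresholds = roc_curve(obese, probs)
print("AUC:", roc_auc_score(obese, probs))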


Purpose

We want to search for and grab laptop information from the Amazon website by rendering pages in the browser via Selenium so that we can easily play around with the data. We originally tried to leverage Scrapy, but the site does not allow us to go that way.

Tool

  • Python package: selenium / BeautifulSoup / random / time / os
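Here is a minimal sketch of how these packages fit together; the search URL and CSS selectors are assumptions, since Amazon's markup changes often.

# Sketch: render an Amazon search in a real browser via Selenium, then
# parse the HTML with BeautifulSoup. URL and selectors are assumptions.
import random
import time
from bs4 import BeautifulSoup
from selenium import webdriver

driver = webdriver.Chrome()
driver.get("https://www.amazon.com/s?k=laptop")
time.sleep(random.uniform(2, 5))   # random pause to look less bot-like

soup = BeautifulSoup(driver.page_source, "html.parser")
for title in soup.select("div.s-result-item h2"):
    print(title.get_text(strip=True))

driver.quit()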

Target Website


Purpose

Search for and grab laptop information from the Staples website via Scrapy and store the data in SQLite so that we can easily play around with the data.

Tool

  • Python package: scrapy / sqlite3 / json
  • SQLite

Target Website

Steps

  1. Let’s set up the Scrapy environment first.
  • You can refer to another of my posts to set up the environment. LINK

  2. Go through the structure of the website to locate where the target elements are.

  • When you inspect the website, you can pick the price of an example item and look it up across the page source, as sketched below. …
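Once the element holding the price is located, a spider can pull it out with a CSS selector. The URL and selectors below are placeholders, since the post's actual values are elided.

# Sketch of a spider that extracts laptop names and prices.
# The start URL and CSS selectors are placeholders.
import scrapy

class StaplesSpider(scrapy.Spider):
    name = "staples"
    start_urls = ["https://www.staples.com/laptops"]

    def parse(self, response):
        # Yield one item per product tile with its name and price.
        for product in response.css("div.product-tile"):
            yield {
                "name": product.css("a.product-title::text").get(),
                "price": product.css("span.price::text").get(),
            }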

SQLite is a C-language library that implements a small, fast, self-contained, high-reliability, full-featured SQL database engine. SQLite is the most used database engine in the world. We are going to discuss how to leverage Python to upload data from a JSON file, MS Excel, and MS Access into SQLite.

JSON file into SQLite

  • Required package: sqlite3/ json
  • Key Parts:
  1. Read the data from the JSON file, define the columns, and append the values.
  2. Define connection of SQLite.
  3. Create table in the database of SQLite (if it does not exist).
  4. Insert data into target table in SQLite DB.
  • Full Py code
import…
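The full code is elided above, so here is a minimal sketch of those four key parts, assuming the JSON file holds a flat list of records; the file, table, and column names are hypothetical.

# Sketch of the four key parts: read JSON, connect, create table, insert.
# File, table, and column names are hypothetical.
import json
import sqlite3

# 1. Read the data from the JSON file, define columns, and append values.
with open("laptops.json") as f:
    records = json.load(f)                     # a list of dicts
columns = list(records[0].keys())
rows = [tuple(r[c] for c in columns) for r in records]

# 2. Define the connection to SQLite.
conn = sqlite3.connect("laptops.db")

# 3. Create the target table in the SQLite database if it does not exist.
cols_sql = ", ".join(f"{c} TEXT" for c in columns)
conn.execute(f"CREATE TABLE IF NOT EXISTS laptops ({cols_sql})")

# 4. Insert the data into the target table.
placeholders = ", ".join("?" for _ in columns)
conn.executemany(f"INSERT INTO laptops VALUES ({placeholders})", rows)
conn.commit()
conn.close()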

Purpose

Search for and grab laptop information from the newegg website via Scrapy and store the data in SQLite so that we can easily play around with the data.

See the full version LINK.

Tool

  • Python package: scrapy / sqlite3 / json
  • SQLite

Target Website

Steps

  1. Let’s set up the Scrapy environment first.
  • You can refer to another of my posts to set up the environment. LINK

  2. Go through the structure of the website to locate where the target elements are.

  • When you inspect the website, you can pick the price of an example item and look it up across the page source. …
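For the storage side, a Scrapy item pipeline is one common way to write each scraped item into SQLite; the sketch below assumes hypothetical database, table, and field names rather than the post's.

# Sketch of an item pipeline that writes each scraped item to SQLite.
# Database, table, and field names are hypothetical.
import sqlite3

class SQLitePipeline:
    def open_spider(self, spider):
        # Open the database and make sure the target table exists.
        self.conn = sqlite3.connect("newegg.db")
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS laptops (name TEXT, price TEXT)")

    def process_item(self, item, spider):
        # Insert one row per scraped item.
        self.conn.execute(
            "INSERT INTO laptops VALUES (?, ?)",
            (item.get("name"), item.get("price")))
        self.conn.commit()
        return item

    def close_spider(self, spider):
        self.conn.close()

A pipeline like this is switched on by adding it to ITEM_PIPELINES in the project's settings.py.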

Purpose

Search for and grab laptop information from the B&H website via Scrapy and store the data in SQLite so that we can easily play around with the data. See the full version LINK.

Tool

  • Python package: scrapy / json / sqlite3
  • SQLite

Target Website

Steps

  1. Let’s set up the Scrapy environment first.
  • Go to the command prompt and type “scrapy startproject bandh”, where “bandh” is the folder name created for this project. After that, you will find a new folder called “bandh” containing a couple of .py files.
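Among those generated files is items.py, where the fields of a scraped record are declared. Here is a minimal sketch, with field names that are assumptions about what the post scrapes.

# Sketch of items.py in the generated “bandh” project.
# The field names are assumptions.
import scrapy

class BandhItem(scrapy.Item):
    name = scrapy.Field()    # laptop model name
    price = scrapy.Field()   # listed price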


Purpose: You may have a report published on a regular basis that requires a lot of formatting work. This session shows you how to leverage VBA to automate the formatting part and save yourself a lot of time.

Free Version LINK

Input: I took a product class-level profitability report as an example.

Here is the structure of the raw data:

1. Data is stored in worksheet “Raw_Data”.
2. There are 10 columns, including business unit info, Division, Department, class hierarchy info, and sales, COGS, and class-level margin information.
3. This is just sample data. …
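The post itself automates this with VBA; as a rough Python analogue of the same idea, here is a sketch with openpyxl. The file name, which columns hold the dollar figures, and the formats applied are all assumptions.

# Rough Python (openpyxl) analogue of the VBA formatting idea:
# bold the header row, format the money columns, freeze the header.
# File name, column positions, and formats are assumptions.
from openpyxl import load_workbook
from openpyxl.styles import Font

wb = load_workbook("profitability_report.xlsx")
ws = wb["Raw_Data"]

for cell in ws[1]:                      # header row
    cell.font = Font(bold=True)

# Assume sales, COGS, and margin sit in the last three of the 10 columns.
for row in ws.iter_rows(min_row=2, min_col=8, max_col=10):
    for cell in row:
        cell.number_format = "#,##0.00"

ws.freeze_panes = "A2"                  # keep headers visible when scrolling
wb.save("profitability_report_formatted.xlsx")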

