
【Flowchart Mode】Basic operational procedures


Abstract: This tutorial demonstrates the basic operational procedures of Flowchart Mode.

1. Enter the correct URL
Copy the URL of the page you want to scrape from your browser, then open ScrapeStorm's Flowchart Mode and paste the URL to create a new scraping task.

Click here to learn more about how to enter the correct URL.

2. Scrape web pages that need to be logged in to view

When scraping data, we sometimes encounter web pages whose content can only be viewed after logging in. In this case, use the Pre Login function to log in to the webpage first and then scrape the data as usual.

Click here to learn more about how to log in to the web page.
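As a rough illustration of what Pre Login accomplishes, the sketch below shows the same idea in plain Python with the requests library: authenticate once, then reuse the logged-in session for scraping. ScrapeStorm handles this inside its built-in browser; the URL and form field names here are made-up placeholders.

```python
# Conceptual sketch only -- ScrapeStorm's Pre Login is a built-in GUI feature.
# This shows the general idea in Python: authenticate first, reuse the session.
# The URL and form field names below are placeholders, not a real site.
import requests

session = requests.Session()

# Step 1: log in so the session holds the authentication cookies.
session.post(
    "https://example.com/login",                      # hypothetical login endpoint
    data={"username": "user", "password": "secret"},  # hypothetical form fields
)

# Step 2: scrape pages that are only visible after logging in.
page = session.get("https://example.com/members-only/data")
print(page.status_code, len(page.text))
```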

3. How to use the components

The ScrapeStorm team encapsulates the complex process of writing scraping code into visual components. In Flowchart Mode, components are divided into action components and flow components; they are the most basic elements that make up a flowchart scraping task.

Click the link to learn more about “Action Components” and “Flow Components”.
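For intuition only, a flowchart scraping task can be pictured as an ordered list of components. The sketch below uses plain Python data to mirror the tutorial's terminology; it is not ScrapeStorm's internal format, and the component names and fields are illustrative.

```python
# Illustrative sketch: a flowchart task can be thought of as an ordered list of
# components. The component names below mirror the tutorial's terminology but
# are plain Python data, not ScrapeStorm's internal format.
flowchart_task = [
    {"type": "action", "name": "Open Page", "url": "https://example.com/list"},
    {"type": "flow",   "name": "Loop",      "target": "next page link"},
    {"type": "action", "name": "Extract Data", "fields": ["title", "price"]},
]

for component in flowchart_task:
    print(f"{component['type']:>6} component: {component['name']}")
```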

4. Set the scraping task component
The user can set up the scraping task's components with system-assisted clicking, or by dragging components in manually. Different scraping tasks require different combinations of components.

Click here for more application scenarios for scraping tasks.

5. Set the extraction field
After setting up the scraping task components, the user can set the fields to be extracted in the Extract Data component.

Click here to learn more about setting up the extracted fields.
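Conceptually, each extraction field points at an element on the page. The hedged sketch below shows the equivalent idea in Python with BeautifulSoup; the HTML snippet, CSS selectors, and field names are invented for illustration and are not produced by ScrapeStorm.

```python
# Sketch of what "extraction fields" correspond to conceptually: each field maps
# to an element on the page. ScrapeStorm configures this visually; the CSS
# selectors and HTML below are made-up examples.
from bs4 import BeautifulSoup

html = """
<div class="item">
  <h2 class="title">Sample product</h2>
  <span class="price">$19.99</span>
</div>
"""

soup = BeautifulSoup(html, "html.parser")
record = {
    "title": soup.select_one(".item .title").get_text(strip=True),
    "price": soup.select_one(".item .price").get_text(strip=True),
}
print(record)  # {'title': 'Sample product', 'price': '$19.99'}
```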

6. Run Settings

After the extraction fields are set, the user can configure how the scraping task runs, either keeping the system's default settings or adjusting the run settings manually.

Click here to learn more about how to configure the scraping task.

(1) Schedule
Ordinary users can schedule a scraping task to start at a fixed point in time. In addition to starting at a fixed time, Professional Plan and above users can also scrape data continuously at a fixed interval (please keep the software running).
Click here to learn more about Schedule.
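To make the two scheduling styles concrete, here is a minimal standard-library Python sketch: run once at a fixed time, or repeat at a fixed interval. It only illustrates the behaviour described above; run_task() is a placeholder, and ScrapeStorm performs this scheduling itself.

```python
# Minimal sketch of the two scheduling styles described above, using only the
# standard library. run_task() is a placeholder for an actual scraping task.
import time
from datetime import datetime, timedelta

def run_task():
    print("scraping run started at", datetime.now())

# Fixed point in time: wait until a target time, then run once.
target = datetime.now().replace(hour=9, minute=0, second=0, microsecond=0)
if target < datetime.now():
    target += timedelta(days=1)          # target already passed today, use tomorrow
time.sleep(max(0, (target - datetime.now()).total_seconds()))
run_task()

# Fixed interval (Professional Plan behaviour): repeat while the program runs.
while True:
    run_task()
    time.sleep(3600)                     # wait one hour between runs
```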

(2) Anti-Block

Users can avoid being blocked by websites through a variety of settings.

Click here to learn more about Anti-Block.
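Anti-blocking settings generally come down to measures such as random delays, rotating user agents, and proxies. The sketch below illustrates these ideas generically in Python; it is not ScrapeStorm's implementation, and the proxy address and user-agent strings are placeholders.

```python
# Generic illustration of common anti-blocking measures, not ScrapeStorm's
# implementation. The proxy address and user-agent strings are placeholders.
import random
import time
import requests

USER_AGENTS = [
    "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7)",
]
PROXIES = {"https": "http://127.0.0.1:8080"}  # placeholder proxy server

def polite_get(url):
    time.sleep(random.uniform(1, 5))                      # random delay between requests
    headers = {"User-Agent": random.choice(USER_AGENTS)}  # rotate the user agent
    return requests.get(url, headers=headers, proxies=PROXIES, timeout=30)

response = polite_get("https://example.com/list?page=1")
print(response.status_code)
```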

(3) Automatic Export
Professional Plan and above users can use Automatic Export to export data while the task is running, so there is no need to wait until the end of the task to export the data. Combining Automatic Export with the Schedule function can greatly save time and improve efficiency.
Click here to learn more about Automatic Export.
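The idea behind Automatic Export can be sketched as writing each record out the moment it is scraped rather than at the end of the task. The Python example below is only an analogy; the file name, field names, and the scrape_records() generator are invented.

```python
# Sketch of the idea behind Automatic Export: write each record as soon as it is
# scraped instead of waiting for the whole task to finish. File name and fields
# are examples only.
import csv

def scrape_records():
    # placeholder generator standing in for an actual scraping loop
    for i in range(3):
        yield {"title": f"Item {i}", "price": f"${i}.99"}

with open("results.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["title", "price"])
    writer.writeheader()
    for record in scrape_records():
        writer.writerow(record)  # exported immediately, not at the end of the task
        f.flush()
```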

(4) Download Images
If the user needs to save images from the web page to the local computer, the Download Images feature can be used to do so.
Click here to learn more about Download Images.
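For illustration, downloading a scraped image URL to a local file looks roughly like the Python sketch below. The image URL and file name are placeholders; ScrapeStorm performs this step automatically when the Download Images option is enabled.

```python
# Illustrative sketch of downloading a scraped image URL to a local file.
# The image URL and output file name are placeholders.
import requests

image_url = "https://example.com/images/product.jpg"  # hypothetical scraped URL
response = requests.get(image_url, timeout=30)
response.raise_for_status()

with open("product.jpg", "wb") as f:
    f.write(response.content)
```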

(5) Speed Boost 

Premium users can use this feature to speed up the task's scraping.

Click here to learn more about Speed Boost.
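Speed Boost amounts to fetching several pages in parallel instead of one at a time. The sketch below shows the general idea with Python's standard ThreadPoolExecutor; the URLs and worker count are arbitrary examples, not ScrapeStorm settings.

```python
# Speed Boost is essentially parallelism: fetch several pages at once instead of
# one after another. Generic sketch with the standard library; URLs are placeholders.
from concurrent.futures import ThreadPoolExecutor
import requests

urls = [f"https://example.com/list?page={n}" for n in range(1, 6)]

def fetch(url):
    return url, requests.get(url, timeout=30).status_code

with ThreadPoolExecutor(max_workers=5) as pool:  # 5 pages scraped concurrently
    for url, status in pool.map(fetch, urls):
        print(status, url)
```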

 

7. View the extraction results and export data
After the task is set, the user can view the extraction result and export the data.

Click here for more ways to view the results of the extraction and export the data.
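As a final illustration, the scraped records form a table that can be viewed and exported to common formats. The pandas sketch below is a generic analogy with made-up data; the export formats shown are just examples.

```python
# Sketch of the final step: the scraped records become a table that can be
# viewed and exported. The data here is a made-up example.
import pandas as pd

records = [
    {"title": "Item 1", "price": "$1.99"},
    {"title": "Item 2", "price": "$2.99"},
]

df = pd.DataFrame(records)
print(df)                                 # view the extraction results
df.to_csv("results.csv", index=False)     # export to CSV
df.to_excel("results.xlsx", index=False)  # export to Excel (requires openpyxl)
```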