
SpringBoot + WebMagic + MyBatis: A Crawler and Data Persistence Example


WebMagic is an open-source crawler framework. In this project we use WebMagic inside a Spring Boot application to scrape data and then persist it to a database with MyBatis.

Project source code: ArticleCrawler — SpringBoot + WebMagic + MyBatis crawler and data persistence example (gitee.com)

Create the database:

In this example the database is named article and the table is cms_content; the table has three columns: contentId, title, and date.

CREATE TABLE `cms_content` (
  `contentId` varchar(40) NOT NULL COMMENT 'content ID',
  `title` varchar(150) NOT NULL COMMENT 'title',
  `date` varchar(150) NOT NULL COMMENT 'publish date',
  PRIMARY KEY (`contentId`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='CMS content table';

Create a new Spring Boot project:

1. Configure the dependencies (pom.xml)

<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 https://maven.apache.org/xsd/maven-4.0.0.xsd">
    <modelVersion>4.0.0</modelVersion>
    <parent>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-starter-parent</artifactId>
        <version>2.5.5</version>
        <relativePath/>
    </parent>
    <groupId>com.example</groupId>
    <artifactId>article</artifactId>
    <version>0.0.1-SNAPSHOT</version>
    <name>article</name>
    <description>article</description>
    <properties>
        <java.version>1.8</java.version>
        <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
        <maven.test.skip>true</maven.test.skip>
        <maven.compiler.plugin.version>3.8.1</maven.compiler.plugin.version>
        <maven.resources.plugin.version>3.1.0</maven.resources.plugin.version>

        <mysql.connector.version>5.1.47</mysql.connector.version>
        <druid.spring.boot.starter.version>1.1.17</druid.spring.boot.starter.version>
        <mybatis.spring.boot.starter.version>1.3.4</mybatis.spring.boot.starter.version>
        <fastjson.version>1.2.58</fastjson.version>
        <commons.lang3.version>3.9</commons.lang3.version>
        <joda.time.version>2.10.2</joda.time.version>
        <webmagic.core.version>0.7.5</webmagic.core.version>
    </properties>

    <dependencies>
        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-web</artifactId>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-starter-test</artifactId>
            <scope>test</scope>
        </dependency>

        <dependency>
            <groupId>org.springframework.boot</groupId>
            <artifactId>spring-boot-configuration-processor</artifactId>
            <optional>true</optional>
        </dependency>

        <dependency>
            <groupId>mysql</groupId>
            <artifactId>mysql-connector-java</artifactId>
            <version>${mysql.connector.version}</version>
        </dependency>

        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>druid-spring-boot-starter</artifactId>
            <version>${druid.spring.boot.starter.version}</version>
        </dependency>

        <dependency>
            <groupId>org.mybatis.spring.boot</groupId>
            <artifactId>mybatis-spring-boot-starter</artifactId>
            <version>${mybatis.spring.boot.starter.version}</version>
        </dependency>

        <dependency>
            <groupId>com.alibaba</groupId>
            <artifactId>fastjson</artifactId>
            <version>${fastjson.version}</version>
        </dependency>

        <dependency>
            <groupId>org.apache.commons</groupId>
            <artifactId>commons-lang3</artifactId>
            <version>${commons.lang3.version}</version>
        </dependency>

        <dependency>
            <groupId>joda-time</groupId>
            <artifactId>joda-time</artifactId>
            <version>${joda.time.version}</version>
        </dependency>

        <dependency>
            <groupId>us.codecraft</groupId>
            <artifactId>webmagic-core</artifactId>
            <version>${webmagic.core.version}</version>
            <exclusions>
                <exclusion>
                    <groupId>org.slf4j</groupId>
                    <artifactId>slf4j-log4j12</artifactId>
                </exclusion>
            </exclusions>
        </dependency>

    </dependencies>

    <build>
        <plugins>
            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <version>${maven.compiler.plugin.version}</version>
                <configuration>
                    <source>${java.version}</source>
                    <target>${java.version}</target>
                    <encoding>${project.build.sourceEncoding}</encoding>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-resources-plugin</artifactId>
                <version>${maven.resources.plugin.version}</version>
                <configuration>
                    <encoding>${project.build.sourceEncoding}</encoding>
                </configuration>
            </plugin>

            <plugin>
                <groupId>org.springframework.boot</groupId>
                <artifactId>spring-boot-maven-plugin</artifactId>
                <configuration>
                    <fork>true</fork>
                    <addResources>true</addResources>
                </configuration>
                <executions>
                    <execution>
                        <goals>
                            <goal>repackage</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
        </plugins>
    </build>

    <repositories>
        <repository>
            <id>public</id>
            <name>aliyun nexus</name>
            <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
            <releases>
                <enabled>true</enabled>
            </releases>
        </repository>
    </repositories>

    <pluginRepositories>
        <pluginRepository>
            <id>public</id>
            <name>aliyun nexus</name>
            <url>http://maven.aliyun.com/nexus/content/groups/public/</url>
            <releases>
                <enabled>true</enabled>
            </releases>
            <snapshots>
                <enabled>false</enabled>
            </snapshots>
        </pluginRepository>
    </pluginRepositories>

</project>

2. Create CmsContentPO.java

The data entity, with fields matching the three columns in the table.

package site.exciter.article.model;

public class CmsContentPO {
    private String contentId;

    private String title;

    private String date;

    public String getContentId() {
        return contentId;
    }

    public void setContentId(String contentId) {
        this.contentId = contentId;
    }

    public String getTitle() {
        return title;
    }

    public void setTitle(String title) {
        this.title = title;
    }

    public String getDate() {
        return date;
    }

    public void setDate(String date) {
        this.date = date;
    }
}

3. Create CrawlerMapper.java

package site.exciter.article.dao;

import org.apache.ibatis.annotations.Mapper;
import site.exciter.article.model.CmsContentPO;

@Mapper
public interface CrawlerMapper {
    int addCmsContent(CmsContentPO record);
}
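
As a side note, the same statement could also be declared with MyBatis annotations instead of the XML mapper configured in the next step. The following is only a sketch of that alternative, not what the project in this article uses:

package site.exciter.article.dao;

import org.apache.ibatis.annotations.Insert;
import org.apache.ibatis.annotations.Mapper;
import site.exciter.article.model.CmsContentPO;

// Annotation-based variant: the SQL lives on the method instead of in CrawlerMapper.xml.
@Mapper
public interface CrawlerMapper {
    @Insert("insert into cms_content (contentId, title, date) values (#{contentId}, #{title}, #{date})")
    int addCmsContent(CmsContentPO record);
}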

4. Configure the mapping file CrawlerMapper.xml

Create a mapper folder under resources and add CrawlerMapper.xml to it.

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE mapper PUBLIC "-//mybatis.org//DTD Mapper 3.0//EN" "http://mybatis.org/dtd/mybatis-3-mapper.dtd">
<mapper namespace="site.exciter.article.dao.CrawlerMapper">

    <insert id="addCmsContent" parameterType="site.exciter.article.model.CmsContentPO">
        insert into cms_content (contentId,
        title,
        date)
        values (#{contentId,jdbcType=VARCHAR},
        #{title,jdbcType=VARCHAR},
        #{date,jdbcType=VARCHAR})
    </insert>
</mapper>
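
Note that the namespace attribute must be the fully qualified name of the CrawlerMapper interface, and the insert id must match the method name addCmsContent; MyBatis uses these two values to bind the XML statement to the mapper method.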

5. Configure application.properties

Configure the database connection and the MyBatis mapper location.

# mysql
spring.datasource.name=mysql
spring.datasource.type=com.alibaba.druid.pool.DruidDataSource
spring.datasource.driver-class-name=com.mysql.jdbc.Driver
spring.datasource.url=jdbc:mysql://10.201.61.184:3306/article?useUnicode=true&characterEncoding=utf8&useSSL=false&allowMultiQueries=true
spring.datasource.username=root
spring.datasource.password=root

# druid
spring.datasource.druid.initial-size=5
spring.datasource.druid.min-idle=5
spring.datasource.druid.max-active=10
spring.datasource.druid.max-wait=60000
spring.datasource.druid.validation-query=select 1 from dual
spring.datasource.druid.test-on-borrow=false
spring.datasource.druid.test-on-return=false
spring.datasource.druid.test-while-idle=true
spring.datasource.druid.time-between-eviction-runs-millis=60000
spring.datasource.druid.min-evictable-idle-time-millis=300000
spring.datasource.druid.max-evictable-idle-time-millis=600000

# mybatis
mybatis.mapper-locations=classpath:mapper/CrawlerMapper.xml
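
If the mapper XML cannot be found at the configured location, MyBatis will typically fail at runtime with an "Invalid bound statement (not found)" error when addCmsContent is called, so make sure this path matches where CrawlerMapper.xml actually lives under resources.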

6. Create ArticlePageProcessor.java

The HTML parsing logic: it extracts each post's title and date, follows links from the list page to the detail pages, and queues the next list page.

package site.exciter.article;

import org.springframework.stereotype.Component;
import us.codecraft.webmagic.Page;
import us.codecraft.webmagic.Site;
import us.codecraft.webmagic.processor.PageProcessor;
import us.codecraft.webmagic.selector.Selectable;

@Component
public class ArticlePageProcessor implements PageProcessor {

    private Site site = Site.me().setRetryTimes(3).setSleepTime(1000);

    @Override
    public void process(Page page) {
        String detail_urls_xpath = "//*[@class='postTitle']/a[@class='postTitle2']/@href";
        String next_page_xpath = "//*[@id='nav_next_page']/a/@href";
        String next_page_css = "#homepage_top_pager > div:nth-child(1) > a:nth-child(7)";
        String title_xpath = "//h1[@class='postTitle']/a/span/text()";
        String date_xpath = "//span[@id='post-date']/text()";
        page.putField("title", page.getHtml().xpath(title_xpath).toString());
        if (page.getResultItems().get("title") == null) {
            page.setSkip(true);
        }
        page.putField("date", page.getHtml().xpath(date_xpath).toString());

        if (page.getHtml().xpath(detail_urls_xpath).match()) {
            Selectable detailUrls = page.getHtml().xpath(detail_urls_xpath);
            page.addTargetRequests(detailUrls.all());
        }

        if (page.getHtml().xpath(next_page_xpath).match()) {
            Selectable nextPageUrl = page.getHtml().xpath(next_page_xpath);
            page.addTargetRequests(nextPageUrl.all());

        } else if (page.getHtml().css(next_page_css).match()) {
            Selectable nextPageUrl = page.getHtml().css(next_page_css).links();
            page.addTargetRequests(nextPageUrl.all());
        }
    }

    @Override
    public Site getSite() {
        return site;
    }
}
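
Before wiring in the database, the XPath and CSS expressions above can be sanity-checked on their own by running the processor with WebMagic's built-in ConsolePipeline, which simply prints the extracted fields. A minimal sketch (assuming the same start URL as in step 8):

import us.codecraft.webmagic.Spider;
import us.codecraft.webmagic.pipeline.ConsolePipeline;

public class ProcessorSmokeTest {
    public static void main(String[] args) {
        // Print the extracted "title"/"date" fields to the console instead of the database.
        Spider.create(new ArticlePageProcessor())
                .addUrl("http://www.cnblogs.com/dick159/default.html?page=2")
                .addPipeline(new ConsolePipeline())
                .thread(1)
                .run(); // run() blocks until the crawl finishes
    }
}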

7. Create ArticlePipeline.java

Handles persisting the scraped data to the database.

package site.exciter.article;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import site.exciter.article.model.CmsContentPO;
import site.exciter.article.dao.CrawlerMapper;
import us.codecraft.webmagic.ResultItems;
import us.codecraft.webmagic.Task;
import us.codecraft.webmagic.pipeline.Pipeline;

import java.util.UUID;

@Component
public class ArticlePipeline implements Pipeline {

    private static final Logger logger = LoggerFactory.getLogger(ArticlePipeline.class);

    @Autowired
    private CrawlerMapper crawlerMapper;

    public void process(ResultItems resultItems, Task task) {
        String title = resultItems.get("title");
        String date = resultItems.get("date");

        CmsContentPO contentPo = new CmsContentPO();
        contentPo.setContentId(UUID.randomUUID().toString());
        contentPo.setTitle(title);
        contentPo.setDate(date);

        try {
            boolean success = crawlerMapper.addCmsContent(contentPo) > 0;
            logger.info("Saved successfully ({}): {}", success, title);
        } catch (Exception ex) {
            logger.error("Save failed", ex);
        }
    }
}
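
Because pages without a title are skipped in ArticlePageProcessor via page.setSkip(true), their ResultItems never reach this pipeline, so only detail pages with a successfully parsed title end up in the database.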

8. Create ArticleTask.java

Runs the crawl as a scheduled task.

package site.exciter.article;

import org.slf4j.Logger;
import org.slf4j.LoggerFactory;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.stereotype.Component;
import us.codecraft.webmagic.Spider;

import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

@Component
public class ArticleTask {
    private static final Logger logger = LoggerFactory.getLogger(ArticleTask.class);

    @Autowired
    private ArticlePipeline articlePipeline;

    @Autowired
    private ArticlePageProcessor articlePageProcessor;

    private ScheduledExecutorService timer = Executors.newSingleThreadScheduledExecutor();

    public void crawl() {
        // Scheduled task: crawl once every 10 minutes
        timer.scheduleWithFixedDelay(() -> {
            Thread.currentThread().setName("ArticleCrawlerThread");

            try {
                Spider.create(articlePageProcessor)
                        .addUrl("http://www.cnblogs.com/dick159/default.html?page=2")
                        // Store the scraped data in the database
                        .addPipeline(articlePipeline)
                        // Crawl with 5 threads
                        .thread(5)
                        // Start the spider asynchronously
                        .start();
            } catch (Exception ex) {
                logger.error("Scheduled crawl thread threw an exception", ex);
            }
        }, 0, 10, TimeUnit.MINUTES);
    }
}
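
One thing the class above leaves open is stopping the scheduler when the application shuts down. A small sketch of what could be added to ArticleTask (my addition, not part of the original code), using the standard @PreDestroy callback:

import javax.annotation.PreDestroy;

    // Called by Spring when the context is closed; stops scheduling further crawls.
    @PreDestroy
    public void shutdown() {
        timer.shutdown();
    }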

9. Modify the application class

package site.exciter.article;

import org.mybatis.spring.annotation.MapperScan;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.CommandLineRunner;
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;

@SpringBootApplication
@MapperScan(basePackages = "site.exciter.article.dao")
public class ArticleApplication implements CommandLineRunner {

    @Autowired
    private ArticleTask articleTask;

    public static void main(String[] args) {
        SpringApplication.run(ArticleApplication.class, args);
    }

    @Override
    public void run(String... args) throws Exception {
        articleTask.crawl();
    }
}

10. Run the application to start crawling and saving data
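
You can start it from the IDE or from the command line, for example with mvn spring-boot:run or by running the jar produced by mvn package; the CommandLineRunner then calls articleTask.crawl() and the scheduled crawl begins immediately.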

That concludes this article on implementing a crawler and saving data to a database with SpringBoot + WebMagic + MyBatis. For more on this topic, please search this site's earlier articles or browse the related articles below, and thank you for your continued support!