# Air Pollution Index Analysis
**Repository Path**: worth-1/whl
## Basic Information
- **Project Name**: 空气污染指标分析 (Air Pollution Index Analysis)
- **Description**: This project covers data collection, data extraction and transformation, data processing, data analysis, data storage, and data visualization. Datasets are obtained from Kaggle and similar sites and uploaded to HDFS; data cleaning and analysis jobs are written in Scala and run on Spark; SpringBoot + MyBatis are used for rapid development of the data API, with data transfer objects built so that results can be read from MySQL for visualization. The front end uses the Vue framework with ECharts to present the data graphically.
- **Primary Language**: Java
- **License**: Not specified
- **Default Branch**: master
- **Homepage**: None
- **GVP Project**: No
## Statistics
- **Stars**: 0
- **Forks**: 3
- **Created**: 2025-03-10
- **Last Updated**: 2025-03-10
## Categories & Tags
**Categories**: Uncategorized
**Tags**: None
## README
# Air Pollution Index Analysis

#### Introduction
**Overview**
This project was completed jointly by Group 8 of Big Data Class 6.
#### Data Sources
https://www.kaggle.com/datasets/hanwizardhanwizard/polling (China air quality data)
https://www.kaggle.com/datasets/mujtabamatin/air-quality-and-pollution-assessment (air quality and pollution assessment)
#### Datasets
| Temperature | Humidity | PM2.5 | PM10 | NO2 | SO2 | CO | Proximity_to_Industrial_Areas | Population_Density | Air_Quality |
|---|---|---|---|---|---|---|---|---|---|
| 10~60 | 30~110 | 0~250 | -1~300 | 0~60 | -10~50 | 0~4 | 0~20 | 100~1000 | Good, Moderate, Poor, Hazardous |

| Month | AQI | Range | Quality Level | PM2.5 | PM10 | CO | SO2 | NO2 | O3 |
|---|---|---|---|---|---|---|---|---|---|
| Jan~Dec | | | | | | | | | |
PM2.5 refers to particulate matter in the air with a diameter of 2.5 micrometers or less, also known as fine particulate matter. A PM2.5 particle is roughly 1/20 the diameter of a human hair. Because these particles are so small, they can remain suspended in the air for long periods and penetrate deep into the lungs, so they have a significant impact on human health and environmental quality.
The main sources of PM2.5 include:
Combustion processes: industrial production, vehicle exhaust, coal-fired power plants, etc.
Natural sources: sandstorms, volcanic eruptions, forest fires, etc.
Ground dust: construction sites, road dust, etc.
Because of their tiny size, PM2.5 particles can carry toxic substances such as heavy metals, chemicals, and bacteria, posing a threat to human health, including but not limited to respiratory and cardiovascular diseases. Many countries and regions therefore treat PM2.5 monitoring and control as an important part of environmental protection and public health.
#### Upload the raw data to HDFS
```
hdfs dfs -mkdir -p /pollution-data/ods
hdfs dfs -put pollution_dataset.csv /pollution-data/ods
```
#### Create the Hive database
```
create database wanghaolin;
```
#### Create the MySQL database
```
create database wanghaolin;
```
#### Project modules
1. Data analysis: pollution-spark
2. Data API: pollution-api
3. Data visualization: pollution-view
#### Data analysis
#### pom.xml
```
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.whl</groupId>
  <artifactId>pollution-analysis</artifactId>
  <version>1.0.0</version>
  <properties>
    <jdk.version>8</jdk.version>
    <maven.compiler.source>${jdk.version}</maven.compiler.source>
    <maven.compiler.target>${jdk.version}</maven.compiler.target>
    <maven.compiler.compilerVersion>${jdk.version}</maven.compiler.compilerVersion>
    <project.build.sourceEncoding>utf-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>utf-8</project.reporting.outputEncoding>
    <maven.test.skip>true</maven.test.skip>
    <skipTests>true</skipTests>
    <spark.version>3.5.3</spark.version>
    <commons-io.version>2.18.0</commons-io.version>
    <commons-lang3.version>3.17.0</commons-lang3.version>
    <hadoop.version>3.4.0</hadoop.version>
    <hive.version>4.0.1</hive.version>
    <java-testdata-generator.version>1.1.2</java-testdata-generator.version>
    <jackson.version>2.15.4</jackson.version>
    <lombok.version>1.18.34</lombok.version>
    <mysql.version>9.1.0</mysql.version>
    <scala.version>2.13.15</scala.version>
  </properties>
  <dependencies>
    <dependency><groupId>org.scala-lang</groupId><artifactId>scala-library</artifactId><version>${scala.version}</version></dependency>
    <dependency><groupId>org.scala-lang</groupId><artifactId>scala-compiler</artifactId><version>${scala.version}</version></dependency>
    <dependency><groupId>com.github.binarywang</groupId><artifactId>java-testdata-generator</artifactId><version>${java-testdata-generator.version}</version></dependency>
    <dependency><groupId>org.apache.spark</groupId><artifactId>spark-core_2.13</artifactId><version>${spark.version}</version></dependency>
    <dependency><groupId>org.apache.spark</groupId><artifactId>spark-sql_2.13</artifactId><version>${spark.version}</version></dependency>
    <dependency><groupId>org.apache.spark</groupId><artifactId>spark-streaming_2.13</artifactId><version>${spark.version}</version></dependency>
    <dependency><groupId>org.apache.spark</groupId><artifactId>spark-hive_2.13</artifactId><version>${spark.version}</version></dependency>
    <dependency><groupId>org.apache.spark</groupId><artifactId>spark-streaming-kafka-0-10_2.13</artifactId><version>${spark.version}</version></dependency>
    <dependency><groupId>org.projectlombok</groupId><artifactId>lombok</artifactId><version>${lombok.version}</version></dependency>
    <dependency><groupId>org.apache.logging.log4j</groupId><artifactId>log4j-slf4j2-impl</artifactId><version>2.24.2</version></dependency>
    <dependency><groupId>org.apache.logging.log4j</groupId><artifactId>log4j-core</artifactId><version>2.24.2</version></dependency>
    <dependency><groupId>org.slf4j</groupId><artifactId>slf4j-api</artifactId><version>2.0.16</version></dependency>
    <dependency><groupId>org.apache.hadoop</groupId><artifactId>hadoop-client</artifactId><version>${hadoop.version}</version></dependency>
    <dependency><groupId>org.apache.hive</groupId><artifactId>hive-jdbc</artifactId><version>${hive.version}</version></dependency>
    <dependency><groupId>org.apache.commons</groupId><artifactId>commons-lang3</artifactId><version>${commons-lang3.version}</version></dependency>
    <dependency><groupId>commons-io</groupId><artifactId>commons-io</artifactId><version>${commons-io.version}</version></dependency>
    <dependency><groupId>com.mysql</groupId><artifactId>mysql-connector-j</artifactId><version>${mysql.version}</version></dependency>
    <dependency><groupId>com.fasterxml.jackson.core</groupId><artifactId>jackson-core</artifactId><version>${jackson.version}</version></dependency>
    <dependency><groupId>com.fasterxml.jackson.core</groupId><artifactId>jackson-annotations</artifactId><version>${jackson.version}</version></dependency>
    <dependency><groupId>com.fasterxml.jackson.core</groupId><artifactId>jackson-databind</artifactId><version>${jackson.version}</version></dependency>
    <dependency><groupId>com.fasterxml.jackson.datatype</groupId><artifactId>jackson-datatype-jsr310</artifactId><version>${jackson.version}</version></dependency>
  </dependencies>
  <build>
    <finalName>${project.artifactId}</finalName>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.13.0</version>
        <configuration>
          <encoding>UTF-8</encoding>
          <source>${jdk.version}</source>
          <target>${jdk.version}</target>
        </configuration>
      </plugin>
      <plugin><groupId>org.apache.maven.plugins</groupId><artifactId>maven-clean-plugin</artifactId><version>3.4.0</version></plugin>
      <plugin><groupId>org.apache.maven.plugins</groupId><artifactId>maven-resources-plugin</artifactId><version>3.3.1</version></plugin>
      <plugin><groupId>org.apache.maven.plugins</groupId><artifactId>maven-war-plugin</artifactId><version>3.4.0</version></plugin>
      <plugin><groupId>org.apache.maven.plugins</groupId><artifactId>maven-jar-plugin</artifactId><version>3.4.2</version></plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-surefire-plugin</artifactId>
        <version>3.5.2</version>
        <configuration><skip>true</skip></configuration>
      </plugin>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>4.9.2</version>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
        </configuration>
        <executions>
          <execution>
            <id>scala-compile-first</id>
            <phase>process-resources</phase>
            <goals><goal>compile</goal><goal>testCompile</goal></goals>
          </execution>
          <execution>
            <id>compile-scala</id>
            <phase>compile</phase>
            <goals><goal>add-source</goal><goal>compile</goal></goals>
          </execution>
          <execution>
            <id>test-compile-scala</id>
            <phase>test-compile</phase>
            <goals><goal>add-source</goal><goal>testCompile</goal></goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>3.7.1</version>
        <configuration>
          <descriptorRefs><descriptorRef>jar-with-dependencies</descriptorRef></descriptorRefs>
        </configuration>
        <executions>
          <execution>
            <id>make-assembly</id>
            <phase>package</phase>
            <goals><goal>single</goal></goals>
          </execution>
        </executions>
      </plugin>
    </plugins>
  </build>
  <repositories>
    <repository>
      <id>public</id>
      <name>aliyun nexus</name>
      <url>https://maven.aliyun.com/repository/public</url>
      <releases><enabled>true</enabled></releases>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>public</id>
      <name>aliyun nexus</name>
      <url>https://maven.aliyun.com/repository/public</url>
      <releases><enabled>true</enabled></releases>
      <snapshots><enabled>false</enabled></snapshots>
    </pluginRepository>
  </pluginRepositories>
</project>
```
#### Cluster configuration files
Download the cluster configuration files and save them in the resources directory.
#### HDFS configuration files
core-site.xml
hdfs-site.xml
#### Hive configuration file
hive-site.xml
#### log4j.properties
```
log4j.rootLogger=error, stdout,R
log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %5p --- [%50t] %-80c(line:%5L) : %m%n
log4j.appender.R=org.apache.log4j.RollingFileAppender
log4j.appender.R.File=../log/agent.log
log4j.appender.R.MaxFileSize=1024KB
log4j.appender.R.MaxBackupIndex=1
log4j.appender.R.layout=org.apache.log4j.PatternLayout
log4j.appender.R.layout.ConversionPattern=%d{yyyy-MM-dd HH:mm:ss,SSS} %5p --- [%50t] %-80c(line:%6L) : %m%n
```
#### Spark utility class
hive.properties
```
warehouse.dir=hdfs://lihaozhe01:8020/user/hive/warehouse
metastore.uris=thrift://lihaozhe01:9083
```
mysql.properties
```
url=jdbc:mysql://lihaozhe01
schema=wanghaolin
user=root
password=Lihaozhe!!@@1122
```
### org.whl.util.spark.SparkUtil
```
package org.whl.util.spark

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

import java.util.Properties

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/25 17:55
 */
object SparkUtil {
  // Returns a SparkSession instance; calling SparkUtil() executes this apply method
  def apply(): SparkSession = {
    // Run subsequent Hadoop jobs as the "root" user
    System.setProperty("HADOOP_USER_NAME", "root")
    // System.setProperty("HADOOP_USER_NAME", "lhz")
    // Create a SparkConf, the configuration object for the Spark application
    val sparkConf = new SparkConf()
    if (!sparkConf.contains("spark.master")) {
      // Default to local mode when no master has been set, so the job runs locally rather than on the cluster
      sparkConf.setMaster("local")
    }
    // Load connection settings from hive.properties on the classpath
    val prop = new Properties()
    prop.load(this.getClass.getClassLoader.getResourceAsStream("hive.properties"))
    val sparkSession: SparkSession = SparkSession
      .builder() // start building the SparkSession
      .appName("Spark SQL JDBC") // application name
      .config(conf = sparkConf) // pass in the SparkConf created above
      // .config("spark.sql.warehouse.dir", prop.getProperty("warehouse.dir"))
      // .config("hive.metastore.uris", prop.getProperty("metastore.uris"))
      .enableHiveSupport() // enable Hive support so the session can access the Hive metastore
      .getOrCreate() // reuse an existing session or create a new one
    sparkSession // return the SparkSession instance
  }

  /**
   * Build MySQL connection properties
   *
   * @param tableName table name; the schema defaults to the one configured in mysql.properties
   * @return connection properties, with "tableName" set to schema.tableName
   */
  def mysqlConnectionProperties(tableName: String): Properties = {
    val prop = new Properties()
    prop.load(this.getClass.getClassLoader.getResourceAsStream("mysql.properties"))
    val schema = prop.remove("schema")
    prop.put("tableName", schema + "." + tableName)
    prop
  }

  /**
   * Build MySQL connection properties
   *
   * @param schema    database name
   * @param tableName table name
   * @return connection properties, with "tableName" set to schema.tableName
   */
  def mysqlConnectionProperties(schema: String, tableName: String): Properties = {
    val prop = new Properties()
    prop.load(this.getClass.getClassLoader.getResourceAsStream("mysql.properties"))
    prop.remove("schema")
    prop.put("tableName", schema + "." + tableName)
    prop
  }

  /**
   * Convert a fraction to a percentage
   *
   * @param number fraction as a double
   * @return percentage, rounded to four decimal places before scaling
   */
  def rate(number: Double): Double = {
    BigDecimal(number).setScale(4, BigDecimal.RoundingMode.HALF_UP).toDouble * 100
  }
}
```
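The `rate` helper rounds the fraction to four decimal places (HALF_UP) before scaling it to a percentage. A minimal Java sketch of the same rounding rule, with illustrative names (`RateDemo` is not part of the project):

```java
import java.math.BigDecimal;
import java.math.RoundingMode;

public class RateDemo {
    // Mirrors SparkUtil.rate: round to 4 decimal places (HALF_UP), then scale to a percentage
    static double rate(double number) {
        return BigDecimal.valueOf(number)
                .setScale(4, RoundingMode.HALF_UP)
                .doubleValue() * 100;
    }

    public static void main(String[] args) {
        System.out.println(rate(0.25));    // prints 25.0
        System.out.println(rate(0.12345)); // 0.12345 rounds to 0.1235 before scaling
    }
}
```

Because the rounding happens before the multiplication, the result can still carry binary floating-point noise; rounding after multiplying by 100 would avoid that.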
### Data cleaning
org.whl.pollution.etl.PollutionEtl
```
package org.whl.pollution.etl

import org.apache.spark.sql.SaveMode
import org.apache.spark.sql.functions.col
import org.apache.spark.sql.types.{DoubleType, StringType}
import org.whl.util.spark.SparkUtil

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/25 17:58
 */
object PollutionEtl {
  def main(args: Array[String]): Unit = {
    // Get the SparkSession
    val sparkSession = SparkUtil()
    // Path of the source data on HDFS
    val originPath: String = "/pollution-data/ods/pollution_dataset.csv"
    // Read the CSV file into a DataFrame
    val df = sparkSession.read.option("header", "true").csv(originPath)
    // Column names
    val cols = Array("Temperature", "Humidity", "PM25", "PM10", "NO2", "SO2", "CO", "Proximity_to_Industrial_Areas", "Population_Density", "Air_Quality")
    // Drop rows containing null values (na.drop returns a new DataFrame, so keep the result)
    val cleaned = df.na.drop(cols)
    // Cast each column to its proper data type
    val table = cleaned.select(
      col("Temperature") cast (DoubleType),
      col("Humidity") cast (DoubleType),
      col("PM25") cast (DoubleType),
      col("PM10") cast (DoubleType),
      col("NO2") cast (DoubleType),
      col("SO2") cast (DoubleType),
      col("CO") cast (DoubleType),
      col("Proximity_to_Industrial_Areas") cast (DoubleType),
      col("Population_Density") cast (DoubleType),
      col("Air_Quality") cast (StringType)
    )
    // Save the data to Hive
    table.write.mode(SaveMode.Overwrite).saveAsTable("wanghaolin.whl_pollution")
    // Release resources
    sparkSession.stop()
  }
}
```
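The ETL applies two rules to every CSV row: drop the row if any required field is missing, otherwise cast the string cells to numbers. A plain-Java sketch of that per-row rule (the `cleanRow` helper is illustrative, not project code):

```java
import java.util.Arrays;
import java.util.List;
import java.util.Optional;

public class CleanDemo {
    // Returns the parsed values, or empty when any cell is missing (the row is dropped),
    // mirroring df.na.drop followed by the cast to DoubleType
    static Optional<double[]> cleanRow(List<String> cells) {
        double[] parsed = new double[cells.size()];
        for (int i = 0; i < cells.size(); i++) {
            String cell = cells.get(i);
            if (cell == null || cell.isBlank()) {
                return Optional.empty();
            }
            parsed[i] = Double.parseDouble(cell);
        }
        return Optional.of(parsed);
    }

    public static void main(String[] args) {
        System.out.println(cleanRow(Arrays.asList("29.8", "59.1", "5.2")).isPresent()); // true
        System.out.println(cleanRow(Arrays.asList("29.8", "", "5.2")).isPresent());     // false
    }
}
```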
#### Hive table creation
_This step can be skipped_
#### Data API
#### Creating the project with Spring Boot
Spring Boot project
pom.xml
```
<?xml version="1.0" encoding="UTF-8"?>
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <parent>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-parent</artifactId>
    <version>3.4.0</version>
  </parent>
  <groupId>org.whl</groupId>
  <artifactId>air-api</artifactId>
  <version>0.0.1</version>
  <name>air-api</name>
  <description>air-api</description>
  <properties>
    <java.version>21</java.version>
    <jdk.version>21</jdk.version>
    <maven.compiler.source>${jdk.version}</maven.compiler.source>
    <maven.compiler.target>${jdk.version}</maven.compiler.target>
    <maven.compiler.compilerVersion>${jdk.version}</maven.compiler.compilerVersion>
    <project.build.sourceEncoding>utf-8</project.build.sourceEncoding>
    <project.reporting.outputEncoding>utf-8</project.reporting.outputEncoding>
    <maven.test.skip>true</maven.test.skip>
    <skipTests>true</skipTests>
    <commons-io.version>2.17.0</commons-io.version>
    <commons-lang3.version>3.17.0</commons-lang3.version>
    <druid.version>1.2.23</druid.version>
    <gson.version>2.11.0</gson.version>
    <hutool.version>5.8.32</hutool.version>
    <layui.version>2.9.18</layui.version>
    <lombok.version>1.18.34</lombok.version>
    <mybatis-plus.version>3.5.9</mybatis-plus.version>
    <mysql.version>9.1.0</mysql.version>
  </properties>
  <dependencies>
    <dependency><groupId>org.springframework.boot</groupId><artifactId>spring-boot-starter-thymeleaf</artifactId></dependency>
    <dependency><groupId>org.springframework.boot</groupId><artifactId>spring-boot-starter-web</artifactId></dependency>
    <dependency><groupId>com.baomidou</groupId><artifactId>mybatis-plus-spring-boot3-starter</artifactId><version>${mybatis-plus.version}</version></dependency>
    <dependency><groupId>com.baomidou</groupId><artifactId>mybatis-plus-jsqlparser</artifactId><version>${mybatis-plus.version}</version></dependency>
    <dependency><groupId>com.baomidou</groupId><artifactId>mybatis-plus-generator</artifactId><version>${mybatis-plus.version}</version></dependency>
    <dependency><groupId>org.springframework.boot</groupId><artifactId>spring-boot-configuration-processor</artifactId><optional>true</optional></dependency>
    <dependency><groupId>org.projectlombok</groupId><artifactId>lombok</artifactId><optional>true</optional></dependency>
    <dependency><groupId>org.springframework.boot</groupId><artifactId>spring-boot-starter-test</artifactId><scope>test</scope></dependency>
    <dependency><groupId>cn.hutool</groupId><artifactId>hutool-all</artifactId><version>${hutool.version}</version></dependency>
    <dependency><groupId>com.github.binarywang</groupId><artifactId>java-testdata-generator</artifactId><version>1.1.2</version></dependency>
    <dependency><groupId>org.apache.commons</groupId><artifactId>commons-lang3</artifactId><version>${commons-lang3.version}</version></dependency>
    <dependency><groupId>commons-io</groupId><artifactId>commons-io</artifactId><version>${commons-io.version}</version></dependency>
    <dependency><groupId>com.google.code.gson</groupId><artifactId>gson</artifactId><version>${gson.version}</version></dependency>
    <dependency><groupId>org.webjars</groupId><artifactId>layui</artifactId><version>${layui.version}</version></dependency>
    <dependency><groupId>com.alibaba</groupId><artifactId>druid</artifactId><version>${druid.version}</version></dependency>
    <dependency><groupId>com.mysql</groupId><artifactId>mysql-connector-j</artifactId><version>${mysql.version}</version></dependency>
    <dependency><groupId>com.github.xiaoymin</groupId><artifactId>knife4j-openapi3-jakarta-spring-boot-starter</artifactId><version>4.5.0</version></dependency>
  </dependencies>
  <build>
    <finalName>${project.name}</finalName>
    <plugins>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.13.0</version>
        <configuration>
          <encoding>UTF-8</encoding>
          <source>${jdk.version}</source>
          <target>${jdk.version}</target>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.springframework.boot</groupId>
        <artifactId>spring-boot-maven-plugin</artifactId>
        <configuration>
          <excludes>
            <exclude><groupId>org.projectlombok</groupId><artifactId>lombok</artifactId></exclude>
          </excludes>
        </configuration>
      </plugin>
    </plugins>
  </build>
  <repositories>
    <repository>
      <id>public</id>
      <name>aliyun nexus</name>
      <url>https://maven.aliyun.com/repository/public</url>
      <releases><enabled>true</enabled></releases>
    </repository>
  </repositories>
  <pluginRepositories>
    <pluginRepository>
      <id>public</id>
      <name>aliyun nexus</name>
      <url>https://maven.aliyun.com/repository/public</url>
      <releases><enabled>true</enabled></releases>
      <snapshots><enabled>false</enabled></snapshots>
    </pluginRepository>
  </pluginRepositories>
</project>
```
#### Import utility classes
Import the util package

#### Import configuration classes
Import the config package

#### 1. Global exception handling
```
package org.whl.airapi.config;

import org.springframework.web.bind.annotation.ExceptionHandler;
import org.springframework.web.bind.annotation.RestControllerAdvice;
import org.whl.airapi.util.response.ResponseResult;
import org.whl.airapi.util.response.ResultCode;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/25 22:20
 */
// @RestControllerAdvice already implies @ControllerAdvice, so one annotation is enough
@RestControllerAdvice
public class GlobalException {
    @ExceptionHandler(Exception.class)
    public ResponseResult<String> defaultErrorHandler(Exception e) {
        return new ResponseResult<>(ResultCode.EXCEPTION.getCode(), e.getMessage());
    }
}
```
#### 2. MyBatis-Plus pagination interceptor
```
package org.whl.airapi.config;

import com.baomidou.mybatisplus.annotation.DbType;
import com.baomidou.mybatisplus.extension.plugins.MybatisPlusInterceptor;
import com.baomidou.mybatisplus.extension.plugins.inner.PaginationInnerInterceptor;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/25 22:22
 */
@Configuration
public class MybatisPlusConfig {
    @Bean
    public MybatisPlusInterceptor mybatisPlusInterceptor() {
        MybatisPlusInterceptor interceptor = new MybatisPlusInterceptor();
        // Since v3.5.9, PaginationInnerInterceptor lives in a separate artifact; the mybatis-plus-jsqlparser dependency is required to use it
        interceptor.addInnerInterceptor(new PaginationInnerInterceptor(DbType.MYSQL));
        return interceptor;
    }
}
```
#### 3. OpenAPI configuration
```
package org.whl.airapi.config;

import io.swagger.v3.oas.models.OpenAPI;
import io.swagger.v3.oas.models.info.Contact;
import io.swagger.v3.oas.models.info.Info;
import lombok.extern.slf4j.Slf4j;
import org.springframework.boot.web.context.WebServerInitializedEvent;
import org.springframework.context.ApplicationListener;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import java.net.Inet4Address;
import java.net.UnknownHostException;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/25 22:23
 */
@Configuration
@Slf4j
public class OpenApiConfig implements ApplicationListener<WebServerInitializedEvent> {
    @Bean
    public OpenAPI springOpenAPI() {
        Contact contact = new Contact();
        contact.setName("王浩霖");
        contact.setUrl("https://space.bilibili.com/480308139");
        contact.setEmail("whl321197@qq.com");
        // Available at: http://localhost:8080/swagger-ui/index.html
        // Available at: http://localhost:8080/doc.html
        return new OpenAPI().info(new Info()
                .title("SpringBoot API")
                .description("SpringBoot Simple Application")
                .contact(contact)
                .version("1.0.0"));
    }

    @Override
    public void onApplicationEvent(WebServerInitializedEvent event) {
        try {
            // Resolve the host IP
            String hostAddress = Inet4Address.getLocalHost().getHostAddress();
            // Resolve the port
            int port = event.getWebServer().getPort();
            // Resolve the application name
            String applicationName = event.getApplicationContext().getApplicationName();
            log.info("Application started! API docs: http://" + hostAddress + ":" + port + applicationName + "/doc.html");
        } catch (UnknownHostException e) {
            e.printStackTrace();
        }
    }
}
```
#### Chart VO class
**org.whl.airapi.vo.Chart**
```
package org.whl.airapi.vo;

import lombok.AllArgsConstructor;
import lombok.Getter;
import lombok.NoArgsConstructor;
import lombok.Setter;
import org.whl.airapi.util.annotation.RowKeyAnnotation;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/18 17:23
 */
@Getter
@Setter
@NoArgsConstructor
@AllArgsConstructor
public class Chart {
    @RowKeyAnnotation
    private String name;
    private Object value;
}
```
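`ListToChart` comes from the project's own util package and its source is not listed here; the general idea, turning each bean into name/value rows for ECharts via reflection, might look roughly like this (a hypothetical sketch, not the project's actual implementation):

```java
import java.lang.reflect.Field;
import java.util.LinkedHashMap;
import java.util.Map;

public class BeanToChartDemo {
    // Read every declared field of a bean into an ordered name -> value map via reflection
    static Map<String, Object> toNameValuePairs(Object bean) throws IllegalAccessException {
        Map<String, Object> pairs = new LinkedHashMap<>();
        for (Field field : bean.getClass().getDeclaredFields()) {
            field.setAccessible(true);
            pairs.put(field.getName(), field.get(bean));
        }
        return pairs;
    }

    // A stand-in bean similar to one analysis row (illustrative only)
    static class Row {
        String temperature = "20~30";
        long good = 42;
    }

    public static void main(String[] args) throws IllegalAccessException {
        System.out.println(toNameValuePairs(new Row()));
    }
}
```

Each name/value pair would then become one `Chart` object in the response.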
#### application.yml
```
server:
  port: 8080
  servlet:
    context-path: /api
spring:
  application:
    name: air-api
  jackson:
    serialization:
      FAIL_ON_EMPTY_BEANS: false
    # exclude properties whose value is null from JSON serialization
    default-property-inclusion: non_null
    # format for Date fields; can be omitted if not needed
    date-format: yyyy-MM-dd HH:mm:ss
    # time zone for Date fields; can be omitted if not needed
    time-zone: GMT+8
  thymeleaf:
    # cache settings
    cache: false
    check-template: true
    check-template-location: true
    # enable MVC thymeleaf view resolution
    enabled: true
    # template mode; supports HTML, XML, TEXT, JAVASCRIPT
    mode: HTML
    # encoding; optional
    encoding: UTF-8
    # template path; defaults to templates, optional
    prefix: classpath:templates
    # file suffix
    suffix: .html
    servlet:
      # content type; optional
      content-type: text/html;charset=utf-8
  datasource:
    driver-class-name: com.mysql.cj.jdbc.Driver
    type: com.alibaba.druid.pool.DruidDataSource
    url: jdbc:mysql://localhost:3306/wanghaolin?allowPublicKeyRetrieval=true&sslMode=DISABLED&useUnicode=true&characterEncoding=UTF8&useSSL=false&useServerPrepStmts=false&rewriteBatchedStatements=true&cachePrepStmts=true&allowMultiQueries=true&serverTimezone=Asia/Shanghai
    username: root
    password: 123456
mybatis-plus:
  # under src/main/java: classpath:/com/lihaozhe/*/mapper/*Mapper.xml
  # under resources: classpath:/mapper/*Mapper.xml
  mapper-locations: classpath:mapper/*.xml
  # entity scan; separate multiple packages with commas or semicolons
  type-aliases-package: org.whl.airapi.dto,org.whl.airapi.vo
  configuration:
    # map snake_case columns to camelCase Java properties automatically (without this, SQL needs aliases: select user_id as userId)
    map-underscore-to-camel-case: true
    cache-enabled: false
    # jdbcTypeForNull; required for Oracle
    jdbc-type-for-null: 'null'
    log-impl: org.apache.ibatis.logging.stdout.StdOutImpl
# swagger configuration
springdoc:
  swagger-ui:
    path: /swagger-ui
    tags-sorter: alpha
    operations-sorter: alpha
  api-docs:
    path: /v3/api-docs
  group-configs:
    - group: 'default'
      paths-to-match: '/**'
      packages-to-scan: org.whl.airapi.controller
# knife4j enhancements; optional
knife4j:
  enable: true
  setting:
    language: zh_cn
  basic:
    # enable basic auth
    enable: true
    # username
    username: admin
    # password
    password: wanghaolin
```
### DTO transfer objects
**org.whl.airapi.dto.AirPollution**
```
```
**org.whl.airapi.dto.Hum**
```
```
### Mapper persistence layer
**org.whl.airapi.mapper.AirPollutionMapper**
```
package org.whl.airapi.mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import org.apache.ibatis.annotations.Mapper;
import org.whl.airapi.dto.AirPollution;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 9:59
 */
@Mapper
public interface AirPollutionMapper extends BaseMapper<AirPollution> {
}
```
**org.whl.airapi.mapper.HumMapper**
```
package org.whl.airapi.mapper;

import com.baomidou.mybatisplus.core.mapper.BaseMapper;
import org.apache.ibatis.annotations.Mapper;
import org.whl.airapi.dto.Hum;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 9:59
 */
@Mapper
public interface HumMapper extends BaseMapper<Hum> {
}
```
### Service layer
**org.whl.airapi.service.AirPollutionService**
```
package org.whl.airapi.service;

import com.baomidou.mybatisplus.extension.service.IService;
import org.whl.airapi.dto.AirPollution;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 10:09
 */
public interface AirPollutionService extends IService<AirPollution> {
}
```
**org.whl.airapi.service.HumService**
```
package org.whl.airapi.service;

import com.baomidou.mybatisplus.extension.service.IService;
import org.whl.airapi.dto.Hum;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 10:09
 */
public interface HumService extends IService<Hum> {
}
```
**org.whl.airapi.service.impl.AirPollutionServiceImpl**
```
package org.whl.airapi.service.impl;

import com.baomidou.mybatisplus.extension.service.impl.ServiceImpl;
import org.springframework.stereotype.Service;
import org.whl.airapi.dto.AirPollution;
import org.whl.airapi.mapper.AirPollutionMapper;
import org.whl.airapi.service.AirPollutionService;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 10:10
 */
@Service
public class AirPollutionServiceImpl extends ServiceImpl<AirPollutionMapper, AirPollution> implements AirPollutionService {
}
```
**org.whl.airapi.service.impl.HumServiceImpl**
```
package org.whl.airapi.service.impl;

import com.baomidou.mybatisplus.extension.service.impl.ServiceImpl;
import org.springframework.stereotype.Service;
import org.whl.airapi.dto.Hum;
import org.whl.airapi.mapper.HumMapper;
import org.whl.airapi.service.HumService;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 10:10
 */
@Service
public class HumServiceImpl extends ServiceImpl<HumMapper, Hum> implements HumService {
}
```
### Controller layer
**org.whl.airapi.controller.AirPollutionController**
```
package org.whl.airapi.controller;

import lombok.RequiredArgsConstructor;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.whl.airapi.dto.AirPollution;
import org.whl.airapi.service.AirPollutionService;
import org.whl.airapi.util.list.ListToChart;
import org.whl.airapi.vo.Chart;

import java.util.List;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 10:12
 */
@RestController
@RequestMapping("/Air")
@RequiredArgsConstructor
public class AirPollutionController {
    private final AirPollutionService airPollutionService;

    @GetMapping("/Pollution")
    public List<List<Chart>> airPollutionZhuxingtu() throws IllegalAccessException {
        List<AirPollution> airpo = airPollutionService.list();
        // Use the ListToChart helper to build bar/line chart data
        List<List<Chart>> chartData = ListToChart.baseLine(airpo, AirPollution.class);
        // Return the result
        return chartData;
    }
}
```
**org.whl.airapi.controller.HumController**
```
package org.whl.airapi.controller;

import lombok.RequiredArgsConstructor;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;
import org.whl.airapi.dto.Hum;
import org.whl.airapi.service.HumService;
import org.whl.airapi.util.list.ListToChart;
import org.whl.airapi.vo.Chart;

import java.util.List;

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/26 10:12
 */
@RestController
@RequestMapping("/Air")
@RequiredArgsConstructor
public class HumController {
    private final HumService humService;

    @GetMapping("/Hum")
    public List<List<Chart>> airPollutionZhuxingtu() throws IllegalAccessException {
        List<Hum> airpo = humService.list();
        // Use the ListToChart helper to build bar/line chart data
        List<List<Chart>> chartData = ListToChart.baseLine(airpo, Hum.class);
        // Return the result
        return chartData;
    }
}
```
#### Run the data API

#### The API docs and data are now available



#### Vue + ECharts + DataV
##### 1. Create the project
```
npm create vite@latest
```
```
cd heart-view
npm install
npm run dev
```
#### 2. Install project dependencies
```
npm install @types/node --save-dev
npm install vue-router@4
npm install animate.css --save
npm install gsap --save
npm install fetch --save
npm install axios --save
npm install pinia
npm install less less-loader -D
npm install sass sass-loader --save-dev
npm install scss scss-loader --save-dev
npm install element-plus --save
npm install -D unplugin-vue-components unplugin-auto-import
npm install echarts echarts-wordcloud --save
npm install @kjgl77/datav-vue3
```
#### 3. Configure the project
vite.config.ts
**tsconfig.node.json**
```
{
  "compilerOptions": {
    "tsBuildInfoFile": "./node_modules/.tmp/tsconfig.node.tsbuildinfo",
    "target": "ES2022",
    "lib": ["ES2023"],
    "module": "ESNext",
    "skipLibCheck": true,
    /* Bundler mode */
    "moduleResolution": "Bundler",
    "allowImportingTsExtensions": true,
    "isolatedModules": true,
    "moduleDetection": "force",
    "noEmit": true,
    /* Linting */
    "strict": true,
    "noUnusedLocals": true,
    "noUnusedParameters": true,
    "noFallthroughCasesInSwitch": true,
    "noUncheckedSideEffectImports": true,
    /* Path aliases */
    "types": ["node"],
    "baseUrl": ".",
    "paths": {
      "@/*": ["src/*"]
    }
  },
  "include": ["vite.config.ts"]
}
```
**tsconfig.json**
```
//{
//  "files": [],
//  "references": [
//    { "path": "./tsconfig.app.json" },
//    { "path": "./tsconfig.node.json" }
//  ]
//}
{
  "compilerOptions": {
    "allowSyntheticDefaultImports": true,
    "moduleResolution": "node",
    "esModuleInterop": true
    // other options
  }
}
```
#### 3.1 Configure path aliases
vite.config.ts
```
import {defineConfig} from 'vite'
import vue from '@vitejs/plugin-vue'
import AutoImport from 'unplugin-auto-import/vite'
import Components from 'unplugin-vue-components/vite'
import {ElementPlusResolver} from 'unplugin-vue-components/resolvers'
import { resolve } from 'path'

const base_url: string = 'http://192.168.10.1:8080'

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [
    vue(),
    AutoImport({
      resolvers: [ElementPlusResolver()],
    }),
    Components({
      // resolvers: [ElementPlusResolver({importStyle: "sass"})],
      resolvers: [ElementPlusResolver()],
    }),
  ],
  server: {
    host: '0.0.0.0',
    strictPort: true,
    open: false,
    proxy: {
      // proxy /api requests to the backend
      '/api': {
        target: base_url,
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/api/, 'api'),
      },
    }
  },
  resolve: { // path aliases
    alias: {
      '@': resolve(__dirname, './src')
    }
  },
  css: {
    preprocessorOptions: {
      scss: {
        // additionalData: '@import "./src/styles/scss_variable.scss";'
        additionalData: `@use "./src/styles/scss_variable.scss" as *;`,
      }
    }
  }
})
```
#### 3.2 Add the router
#### 3.2.1 Create the router-outlet view and import it in App.vue
**src/components/DataV.vue**
```
```
**App.vue**
```
```
#### 3.2.2 Create the route file
src/routers/router.ts
```
// 1. Define route components.
// They can also be imported from other files
import {createMemoryHistory, createRouter, RouteRecordRaw} from 'vue-router'
import Main from '../views/Main/IndexView.vue'

// 2. Define some routes
const routes: Array<RouteRecordRaw> = [
  {
    name: 'main',
    path: '/',
    component: Main,
  },
  // handle 404
  {path: '/:pathMatch(.*)*', redirect: "/"},
]

// 3. Create the router instance and pass the `routes` option
// More options are possible, but this is kept simple here
const router = createRouter({
  // 4. Pick a history implementation.
  // memory mode: createMemoryHistory
  // hash mode: createWebHashHistory
  // html5 mode: createWebHistory
  history: createMemoryHistory(),
  routes,
})

export default router
```
#### 3.2.3 Import the router configuration
main.ts
```
import {createApp} from 'vue';
import './style.css';
import App from './App.vue';
import router from "./routers/router.ts";
const app = createApp(App);
app.use(router);
app.mount('#app');
```
#### 3.3 Add the animate.css animation library
main.ts
```
import {createApp} from 'vue';
import './style.css';
import 'animate.css'
import App from './App.vue';
import router from "./routers/router.ts";
const app = createApp(App);
app.use(router);
app.mount('#app');
```
##### 3.4 Add pinia
```
import {createApp} from 'vue';
import './style.css';
import 'animate.css'
import App from './App.vue';
import router from "./routers/router.ts";
import {createPinia} from "pinia";
const app = createApp(App);
// Note that createPinia imported from pinia is a function; it must be called before mounting
// const pinia = createPinia()
// app.use(pinia)
app.use(createPinia())
app.use(router);
app.mount('#app');
```
##### 3.4.1 Create the state management file store.ts
src/stores/store.ts
```
import {defineStore} from 'pinia'
export const useStore = defineStore('main', () => {
  // ref() and reactive() become state properties
  // computed() becomes getters
  // function() becomes actions
  return {}
});
```
##### 3.5 Configure scss
##### 3.5.1 Create the scss variables file scss_variable.scss
**src/styles/scss_variable.scss**
```
// width and height of a single chart
$chart-width: 100%;
$chart-height: 100%;
```
##### 3.5.2 Import the scss variables file scss_variable.scss
vite.config.ts
```
import {defineConfig} from 'vite'
import { resolve } from 'path'

const base_url: string = 'http://192.168.10.1:8080'

// https://vitejs.dev/config/
export default defineConfig({
  server: {
    host: '0.0.0.0',
    strictPort: true,
    open: false,
    proxy: {
      // proxy /api requests to the backend
      '/api': {
        target: base_url,
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/api/, 'api'),
      },
    }
  },
  resolve: { // path aliases
    alias: {
      '@': resolve(__dirname, './src')
    }
  },
  css: {
    preprocessorOptions: {
      scss: {
        // additionalData: '@import "./src/styles/scss_variable.scss";'
        additionalData: `@use "./src/styles/scss_variable.scss" as *;`,
      }
    }
  }
})
})
```
#### 3.6 Configure Element Plus
Full import
On-demand import
Auto import (this setup uses auto import)
**vite.config.ts**
```
import {defineConfig} from 'vite'
import vue from '@vitejs/plugin-vue'
import AutoImport from 'unplugin-auto-import/vite'
import Components from 'unplugin-vue-components/vite'
import {ElementPlusResolver} from 'unplugin-vue-components/resolvers'
import { resolve } from 'path'

const base_url: string = 'http://192.168.10.1:8080'

// https://vitejs.dev/config/
export default defineConfig({
  plugins: [
    vue(),
    AutoImport({
      resolvers: [ElementPlusResolver()],
    }),
    Components({
      // resolvers: [ElementPlusResolver({importStyle: "sass"})],
      resolvers: [ElementPlusResolver()],
    }),
  ],
  server: {
    host: '0.0.0.0',
    strictPort: true,
    open: false,
    proxy: {
      // proxy /api requests to the backend
      '/api': {
        target: base_url,
        changeOrigin: true,
        rewrite: (path) => path.replace(/^\/api/, 'api'),
      },
    }
  },
  resolve: { // path aliases
    alias: {
      '@': resolve(__dirname, './src')
    }
  },
  css: {
    preprocessorOptions: {
      scss: {
        // additionalData: '@import "./src/styles/scss_variable.scss";'
        additionalData: `@use "./src/styles/scss_variable.scss" as *;`,
      }
    }
  }
})
```
#### 3.7 Shared ECharts component
src/components/ChartLhz.vue
```
```
#### 3.8 Add DataV
**commons.css**
```
html, body {
width: 100%;
height: 100%;
padding: 0;
margin: 0;
}
```
**main.ts**
```
import {createApp} from 'vue'
// import './style.css'
import './assets/css/commons.css'
import 'animate.css'
import {createPinia} from "pinia"
import App from './App.vue'
import DataVVue3 from '@kjgl77/datav-vue3'
import router from './routers/router.ts'
// import axios from 'axios'
// axios.defaults.headers.common['Authorization'] = 'Bearer ' + localStorage.getItem('token')
// axios.defaults.baseURL = 'http://localhost:8080'
const app = createApp(App)
// Note that createPinia imported from pinia is a function; it must be called before mounting
// const pinia = createPinia()
// app.use(pinia)
app.use(createPinia())
app.use(DataVVue3)
app.use(router)
app.mount('#app')
```
#### 3.9 Utilities
src/utils/lhz-util.ts
```
/**
 * Data transformation helpers
 */
class Transpose {
  /**
   * Transpose rows and columns
   * @param matrix 2-D matrix
   * @return the transposed 2-D matrix
   */
  transposeMatrix(matrix: any[][]): any[][] {
    // dimensions of the original matrix
    const rows = matrix.length;
    const cols = matrix[0].length;
    // initialise the transposed matrix
    const transposed = new Array(cols).fill(0).map(() => new Array(rows).fill(0));
    // transpose
    for (let i = 0; i < rows; i++) {
      for (let j = 0; j < cols; j++) {
        transposed[j][i] = matrix[i][j];
      }
    }
    return transposed;
  }

  /**
   * Expand a Chinese province / municipality / special administrative region name
   * @param regionName the first two characters of the full region name
   * @return the full region name
   */
  formatRegionName(regionName: string): string {
    let map = new Map();
    map.set("北京", "北京市");
    map.set("天津", "天津市");
    map.set("河北", "河北省");
    map.set("山西", "山西省");
    map.set("内蒙古", "内蒙古自治区");
    map.set("辽宁", "辽宁省");
    map.set("吉林", "吉林省");
    map.set("黑龙江", "黑龙江省");
    map.set("上海", "上海市");
    map.set("江苏", "江苏省");
    map.set("浙江", "浙江省");
    map.set("安徽", "安徽省");
    map.set("福建", "福建省");
    map.set("江西", "江西省");
    map.set("山东", "山东省");
    map.set("河南", "河南省");
    map.set("湖北", "湖北省");
    map.set("湖南", "湖南省");
    map.set("广东", "广东省");
    map.set("广西", "广西壮族自治区");
    map.set("海南", "海南省");
    map.set("重庆", "重庆市");
    map.set("四川", "四川省");
    map.set("贵州", "贵州省");
    map.set("云南", "云南省");
    map.set("西藏", "西藏自治区");
    map.set("陕西", "陕西省");
    map.set("甘肃", "甘肃省");
    map.set("青海", "青海省");
    map.set("宁夏", "宁夏回族自治区");
    map.set("新疆", "新疆维吾尔自治区");
    map.set("台湾", "台湾省");
    map.set("香港", "香港特别行政区");
    map.set("澳门", "澳门特别行政区");
    return map.get(regionName) || regionName;
  }
}

/**
 * ID-card helpers
 */
class IdCard {
  /**
   * Extract the date of birth
   * @param idCard ID-card number
   * @return date of birth in yyyy-MM-dd format
   */
  getBirthday(idCard: string): string {
    // return idCard.substring(6, 10) + '-' + idCard.substring(10, 12) + '-' + idCard.substring(12, 14);
    let array = [];
    array.push(idCard.substring(6, 10));
    array.push(idCard.substring(10, 12));
    array.push(idCard.substring(12, 14));
    return array.join('-');
  }

  /**
   * Compute the age
   * @param idCard ID-card number
   * @return age in full years
   */
  getAge(idCard: string): number {
    const now = new Date();
    const birthMonth = parseInt(idCard.substring(10, 12));
    const birthDay = parseInt(idCard.substring(12, 14));
    let age = now.getFullYear() - parseInt(idCard.substring(6, 10));
    const month = now.getMonth() + 1;
    // subtract one year if this year's birthday has not happened yet
    if (month < birthMonth || (month === birthMonth && now.getDate() < birthDay)) {
      age--;
    }
    return age;
  }

  /**
   * Determine the gender
   * @param idCard ID-card number
   * @returns 1 for male, 0 for female
   */
  getGender(idCard: string): number {
    return parseInt(idCard.substring(16, 17)) % 2 === 1 ? 1 : 0;
  }
}

export default {
  transpose: new Transpose(),
  idCardUtil: new IdCard(),
};
```
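`transposeMatrix` above is a plain row/column swap; the equivalent logic in Java (illustrative):

```java
public class TransposeDemo {
    // Swap rows and columns of a rectangular matrix
    static int[][] transpose(int[][] matrix) {
        int rows = matrix.length, cols = matrix[0].length;
        int[][] out = new int[cols][rows];
        for (int i = 0; i < rows; i++) {
            for (int j = 0; j < cols; j++) {
                out[j][i] = matrix[i][j];
            }
        }
        return out;
    }

    public static void main(String[] args) {
        int[][] t = transpose(new int[][]{{1, 2, 3}, {4, 5, 6}});
        System.out.println(java.util.Arrays.deepToString(t)); // [[1, 4], [2, 5], [3, 6]]
    }
}
```

The front end uses this to flip API rows into the column-oriented series that ECharts expects.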
#### 4. Page layout
#### 4.1 Global scss
src/styles/scss_variable.scss
```
html, body {
  width: 100vw;
  height: 100vh;
  margin: 0;
  padding: 0;
}

#app_container {
  margin: 0;
  padding: 0;
  height: 100vh;
  background-color: #1F2327;
  color: #fff;

  #dv-full-screen-container {
    // the full-screen container's background image can be set here
    background-image: url('@/assets/img/bg.png');
    background-size: 100% 100%;
    box-shadow: 0 0 3px blue;
    display: flex;
    flex-direction: column;
  }
}

.whl-row, .whl-col {
  width: 100%;
  height: 100%;
  display: flex;
  flex: 1;
}

.whl-row {
  flex-direction: row;
  justify-content: space-between;
}

.whl-col {
  flex-direction: column;
  align-items: center;
}
```
#### 4.2 Main page layout
src/components/DataV.vue
```
```
This page uses Element Plus layout tags.
#### 4.3 Header title
**src/views/TopHeader/IndexView.vue**
```
```
#### 4.4 Import the header component in the main page
src/components/DataV.vue
```
```
#### 1. Visualization layout
#### 1.1 Data analysis
**In pollution-analysis**
**org.whl.pollution.analysis.TemAirquality**
```
package org.whl.pollution.analysis

import org.apache.spark.sql.functions._
import org.apache.spark.sql.{Row, SparkSession}
import org.whl.util.spark.SparkUtil

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/25 20:55
 */
object TemAirquality {
  def main(args: Array[String]): Unit = {
    // val spark = SparkSession.builder.appName("AirQualityAnalysis").enableHiveSupport().getOrCreate()
    val spark: SparkSession = SparkUtil()
    // Load the data from the Hive database
    val df = spark.read.table("wanghaolin.whl_pollution")
    // Temperature ranges
    val tempRanges = Array((10, 20), (20, 30), (30, 40), (40, 50), (50, 60))
    // Implicit conversions (for the $"..." column syntax and toDF)
    import spark.implicits._
    // For each temperature range, count the rows per air-quality label and build one result row
    val rows = tempRanges.map { case (low, high) =>
      val counts = df.filter($"Temperature" >= low && $"Temperature" < high)
        .groupBy("Air_Quality")
        .count()
        .collect()
        .map { case Row(quality: String, count: Long) => quality -> count }
        .toMap
      (s"$low~$high",
        counts.getOrElse("Moderate", 0L),
        counts.getOrElse("Good", 0L),
        counts.getOrElse("Hazardous", 0L),
        counts.getOrElse("Poor", 0L))
    }
    val resultDF = rows.toSeq.toDF("Temperature", "Moderate", "Good", "Hazardous", "Poor")
    // Register the result DataFrame as a temporary view so it can be written to a Hive table
    resultDF.createOrReplaceTempView("temp_view")
    // Write the result to a Hive table
    spark.sql("CREATE TABLE IF NOT EXISTS wanghaolin.whl_tem_air AS SELECT * FROM temp_view")
    // Stop the SparkSession
    spark.stop()
  }
}
```
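At its core the job buckets temperatures into 10-degree ranges and counts air-quality labels per bucket. The same aggregation over an in-memory list in plain Java (data and names are illustrative):

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class BucketDemo {
    // Label a temperature with its 10-degree range, mirroring the [low, high) filters in the Spark job
    static String bucket(double temperature) {
        int low = (int) (Math.floor(temperature / 10) * 10);
        return low + "~" + (low + 10);
    }

    // Count air-quality labels per temperature bucket: bucket -> (quality -> count)
    static Map<String, Map<String, Long>> countPerBucket(List<Double> temps, List<String> qualities) {
        Map<String, Map<String, Long>> counts = new LinkedHashMap<>();
        for (int i = 0; i < qualities.size(); i++) {
            counts.computeIfAbsent(bucket(temps.get(i)), k -> new LinkedHashMap<>())
                  .merge(qualities.get(i), 1L, Long::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        List<Double> temps = List.of(12.0, 15.5, 27.3, 28.0);
        List<String> qualities = List.of("Good", "Moderate", "Good", "Good");
        System.out.println(countPerBucket(temps, qualities)); // {10~20={Good=1, Moderate=1}, 20~30={Good=2}}
    }
}
```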
#### Submitting to the cluster
```
spark-submit --master yarn --deploy-mode cluster --class org.whl.pollution.analysis.TemAirquality pollution-analysis-jar-with-dependencies.jar
```
#### Sync the analysis results to MySQL
```
package org.whl.pollution.mysql

import org.apache.spark.sql.{SaveMode, SparkSession}
import org.whl.util.spark.SparkUtil

import java.util.Properties

/**
 * @author 王浩霖
 * @version 1.0.0 2024/12/25 21:54
 */
object TemAirMysql {
  def main(args: Array[String]): Unit = {
    // Get the SparkSession
    val sparkSession: SparkSession = SparkUtil()
    // Sync the analysis result to MySQL
    import sparkSession.sql
    // Load the air-quality-by-temperature analysis result
    val df_all = sql("select * from wanghaolin.whl_tem_air")
    val properties: Properties = SparkUtil.mysqlConnectionProperties("wanghaolin", "whl_tem_air")
    val url: String = properties.getProperty("url")
    val tableName: String = properties.getProperty("tableName")
    df_all.write.mode(SaveMode.Overwrite).jdbc(url = url, table = tableName, connectionProperties = properties)
    sparkSession.stop()
  }
}
```
#### 1.2 Visualization layout
src/views/Main/IndexView.vue
```
```
#### 1.3 **Air quality distribution at 20~30 °C**
**src/views/Top/Left.vue**
```
```
#### 1.4 **Air component distribution in Beijing, Shanghai, and Guangzhou**
**src/views/Top/Center.vue**
```
```
#### 1.5 **Air component levels at different humidity**
**src/views/Top/Right.vue**
```
```
#### 1.6 **Air quality at different temperatures**
**src/views/Bottom/Bottom.vue**
```
```
#### 2. **Main page layout**
#### 2.1 IndexView.vue
**src/views/Main/IndexView.vue**
```
```

#### Contributors
Big Data Class 5, Group 8
Leader: 王浩霖
Members: 崔卓斐, 张三君
Advisor: 李胜龙
#### Features
1. The project's data is obtained from Kaggle and similar sites.
2. After the data is uploaded to HDFS, Scala and Spark are used for data cleaning and analysis.
3. A data API is built on top of the results.
4. The visualization project is built with the Vue framework, with charts rendered by ECharts.
5. From this project one can analyse the air quality at different temperatures and how air component levels change with humidity.