Migrating data from MySQL to ClickHouse
1. An introduction to ClickHouse: https://zhuanlan.zhihu.com/p/370201180
2. How JD.com stores 100 GB-per-second log data in ClickHouse: https://new.qq.com/omn/20220408/20220408A03TA600.html
3. Steps to migrate MySQL data into ClickHouse:
# clickhouse-client
:) create database newDB;
:) use newDB;
-- Import the data:
CREATE TABLE Orders ENGINE = MergeTree ORDER BY OrderID
AS SELECT * FROM mysql('10.42.134.136:3307', 'DBNAME', 'Orders', 'root', 'PASSWORD');
Ok. 0 rows in set. Elapsed: 1832.114 sec. ...
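The one-shot `CREATE TABLE ... AS SELECT` above took roughly half an hour for a single table. For very large tables it can help to copy rows in batches so each chunk commits independently. A minimal sketch of that batching logic, using Python's built-in sqlite3 as a stand-in for both the MySQL source and the ClickHouse target (the `orders` table and its columns are made up for illustration):

```python
import sqlite3

def migrate_in_batches(src, dst, batch_size=1000):
    """Copy all rows from src.orders to dst.orders, batch_size rows at a time."""
    cur = src.execute("SELECT order_id, amount FROM orders ORDER BY order_id")
    while True:
        batch = cur.fetchmany(batch_size)
        if not batch:
            break  # cursor exhausted: all rows copied
        dst.executemany("INSERT INTO orders VALUES (?, ?)", batch)
        dst.commit()  # each batch commits on its own

# Demo: two in-memory databases stand in for MySQL and ClickHouse.
src = sqlite3.connect(":memory:")
dst = sqlite3.connect(":memory:")
src.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
dst.execute("CREATE TABLE orders (order_id INTEGER, amount REAL)")
src.executemany("INSERT INTO orders VALUES (?, ?)",
                [(1, 9.5), (2, 3.0), (3, 12.25), (4, 7.0), (5, 1.5)])
src.commit()

migrate_in_batches(src, dst, batch_size=2)
print(dst.execute("SELECT COUNT(*) FROM orders").fetchone()[0])
```

With a real ClickHouse target the same loop would go through a client library, but the chunked fetch/insert/commit shape is the part that matters.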
Hive 3.1.2 on Spark
1. Install Java (OpenJDK 8)
2. Install MySQL:
wget https://dev.mysql.com/get/mysql57-community-release-el7-8.noarch.rpm
rpm -ivh mysql57-community-release-el7-8.noarch.rpm
cd /etc/yum.repos.d/
rpm --import https://repo.mysql.com/RPM-GPG-KEY-mysql-2022
yum -y install mysql-server
systemctl start mysqld
grep 'temporary password' /var/log/mysqld.log
mysql -uroot -p
set global validate_password_policy=LOW;
set global validate_password_length=4;
ALTER USER 'root'@'localhost' ID...
A Spark version of "hello world"
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

object WordCount {
  def main(args: Array[String]) {
    val inputFile = "/Users/artefact/software/spark-3.1.3-bin-hadoop3.2/data/wordcount.txt"
    val conf = new SparkConf().setAppName("WordCount").setMaster("local")
    val sc = new SparkContext(conf)
    val textFile = sc.textFile(inputFile)
    val wordCount = textFile
      .flatMap(_.split(" "))
      //.flat...
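The excerpt cuts off mid-pipeline, but the classic word-count shape is flatMap → map → reduceByKey. The stages can be mirrored in plain Python to see what each one does — a stand-alone sketch, not Spark code, with the input lines inlined instead of read from a file:

```python
from collections import Counter
from itertools import chain

lines = ["hello world", "hello spark"]  # stand-in for sc.textFile(inputFile)

# flatMap(_.split(" ")): one flat stream of words across all lines
words = chain.from_iterable(line.split(" ") for line in lines)

# map(word => (word, 1)): pair each word with a count of 1
pairs = ((word, 1) for word in words)

# reduceByKey(_ + _): sum the counts per word
counts = Counter()
for word, n in pairs:
    counts[word] += n

print(sorted(counts.items()))  # [('hello', 2), ('spark', 1), ('world', 1)]
```

In Spark each stage is lazy and distributed across partitions, but the data flow per record is exactly this.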
Configuring a Scala 2.12 / Spark 3.0.2 development environment in IDEA
Basic development environment: download the corresponding packages.
Maven: https://mvnrepository.com/search?q=spark
Spark: http://spark.apache.org/downloads.html
Scala: https://www.scala-lang.org/download/2.12.12.html (note: Spark 3 uses Scala 2.12.*)
Java: https://www.oracle.com/java/technologies/javase/javase-jdk8-downloads.html
IDE configuration: install the Scala plugin.
Project setup: enable the Scala plugin and register the local Scala jars.
File -> Project Structure -> add the jars from the downloaded Spark distribution.
Code:
import org.apache.spark.SparkContext...
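If the dependencies come from Maven rather than the unpacked Spark distribution, the core coordinate matching this setup (Spark 3.0.2 built for Scala 2.12) is the following; add `spark-sql_2.12` the same way if Spark SQL is needed:

```xml
<dependency>
    <groupId>org.apache.spark</groupId>
    <artifactId>spark-core_2.12</artifactId>
    <version>3.0.2</version>
</dependency>
```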
Notes on Hive permission configuration, part 2
root> beeline -u jdbc:hive2://localhost:10000
When connecting to Hive with beeline, a permission-check failure is reported:
org.apache.hadoop.security.AccessControlException: Permission denied: user=anonymous, access=EXECUTE, inode="/tmp":root:supergroup:drwx------
Add the following to hdfs-site.xml:
<property>
  <name>dfs.permissions</name>
  <value>false</value>
</property>
hive-site.xml configuration:
<?xml version="1.0"?>
<?xml-stylesheet type=...
Installing and configuring Hive on CentOS
To start, some Hive-related links:
Hive website: http://hive.apache.org
Official Hive reference docs: https://cwiki.apache.org/confluence/display/Hive/GettingStarted
Downloads for all Hive versions: http://archive.apache.org/dist/hive
Hive on GitHub: https://github.com/apache/hive
1. Prerequisites
Hive runs on top of a Hadoop cluster, so a cluster must be in place before installing Hive. For Hadoop cluster installation, see:
CentOS 7.7: installing a Hadoop 2.10.1 cluster
CentOS 7.7: installing a Hadoop 3.1.3 cluster
Hadoop HA cluster installation (to be added)
An already-installed MySQL database...
Installing MySQL 5.7 with Docker
docker run -itd --name mysql -p 3306:3306 -e MYSQL_ROOT_PASSWORD=123456 mysql:5.7

docker run -p 3306:3306 --name mysql \
  -v /root/mysql/conf:/etc/mysql/conf.d \
  -v /root/mysql/logs:/logs \
  -v /root/mysql/data:/var/lib/mysql \
  -e MYSQL_ROOT_PASSWORD=1234567 -d mysql:5.7

Installing MySQL with docker-compose:
version: '3'
services:
  mysql:
    container_name: mysql
    image: mysql:5.7
    build:
      context: ./mysql
    ports:
      - "3307:3306"
    vol...
Starting single-node Hadoop on Deepin
Download Hadoop
2. core-site.xml configuration
Note: every file below lives in its corresponding directory; it can be edited on the command line or in VS Code.
<configuration>
  <property>...
Bilibili VIP video parsing
// ==UserScript==
// @name         JIEXIE
// @namespace    http://tampermonkey.net/
// @version      0.1
// @description  try to take over the world!
// @author       You
// @match        https://www.bilibili.com/*
// @grant        none
// @require      https://apps.bdimg.com/libs/jquery/1.4.2/jquery.min.js
// ==/UserScript==

(function() {
    'use strict';
    console.log($);
    setTimeout(() => {
        $(".bpx-player-primary-area").html("<iframe style='height:545px;...
Spring Boot: designing and implementing a captcha after 3 failed logins
Preface: skipping the captcha keeps login convenient; requiring one prevents brute-force attacks. To get both, show a captcha only after a user has failed to log in 3 times.

1. Recording failed logins: in login, look up and update the user's failure count.
WARNING: the frontend refreshes the captcha whether verification succeeds or fails; on the backend, a captcha is invalidated after a single use.

@RequestMapping("/login")
public Result login(@RequestBody LoginBody loginBody) {
    int errorTimes = redis.get(loginBody.getUsername() + "-loginError");
    boolean needKaptcha = false;
    if (errorTimes > 2...
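The counter logic the Java snippet starts (and the excerpt cuts off) can be sketched end to end in a few lines. Below, a plain dict stands in for Redis and the password/captcha checks are stubbed out; `FAIL_LIMIT`, `check_password`, and the return shape are illustrative assumptions, not the post's actual API:

```python
FAIL_LIMIT = 3  # require a captcha once errorTimes > 2, i.e. from the 4th attempt on

failed = {}  # stand-in for Redis: username -> consecutive failed logins

def login(username, password, captcha_ok=True,
          check_password=lambda u, p: p == "secret"):
    """Return (success, captcha_required_on_next_attempt)."""
    errors = failed.get(username, 0)
    if errors >= FAIL_LIMIT and not captcha_ok:
        return False, True                 # captcha required but not solved
    if check_password(username, password):
        failed.pop(username, None)         # reset the counter on success
        return True, False
    failed[username] = errors + 1          # record one more failure
    return False, failed[username] >= FAIL_LIMIT

for _ in range(3):
    print(login("alice", "wrong"))             # third call flags the captcha
print(login("alice", "wrong", captcha_ok=False))  # blocked until captcha is solved
print(login("alice", "secret"))                   # captcha solved, correct password
```

In the real service the counter would live in Redis with an expiry (so stale failures age out) and the captcha token would be deleted after one verification, per the warning above.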