
In theory, event time and processing time can be mixed, but testing the code below revealed a bug: after sending a single record, output is emitted endlessly. Reporting it upstream turned out to be awkward: bugs/issues cannot be filed on Flink's GitHub, and after creating a new Apache JIRA account, Flink still required yet another account, so I gave up.

Reproducing the bug: run the code in IDEA, then send one record to the source Kafka topic:

```
a,1,1690304400000
```

and the job prints output endlessly.

In theory mixing time semantics is not recommended, but inside a rich function they can indeed be mixed and behave normally.

Reproduction code:

```scala
package com.yy.flinkWindowAndTrigger

import com.yy.flinkWindow.M1
import org.apache.flink.api.common.eventtime.WatermarkStrategy
import org.apache.flink.configuration.{Configuration, RestOptions}
import org.apache.flink.connector.kafka.source.KafkaSource
import org.apache.flink.connector.kafka.source.enumerator.initializer.OffsetsInitializer
import org.apache.flink.streaming.api.functions.timestamps.BoundedOutOfOrdernessTimestampExtractor
import org.apache.flink.streaming.api.scala._
import org.apache.flink.streaming.api.windowing.assigners.TumblingEventTimeWindows
import org.apache.flink.streaming.api.windowing.time.Time
import org.apache.flink.streaming.api.windowing.time.Time.seconds
import org.apache.flink.streaming.api.windowing.triggers.ContinuousProcessingTimeTrigger
import org.apache.flink.streaming.api.windowing.windows.TimeWindow

object flinkEventWindowAndProcessTriggerBUGLearn {

  def main(args: Array[String]): Unit = {
    // Start a local Flink environment with the web UI on port 28080
    val conf = new Configuration
    conf.setInteger(RestOptions.PORT, 28080)
    val env = StreamExecutionEnvironment.createLocalEnvironmentWithWebUI(conf)
    // val env = StreamExecutionEnvironment.getExecutionEnvironment
    env.setParallelism(1)
    env.configure(conf)

    /*
      Kafka input examples:
      a,1,1690304400000   // 2023-07-26 01:00:00 (endless output)
      a,1,1693037756000   // 2023-08-26 16:15:56 (1 record per second)
      a,1,7200000         // 1970-01-01 10:00:00
    */

    val brokers = "172.18.105.147:9092"
    val source = KafkaSource.builder[String]
      .setBootstrapServers(brokers)
      .setTopics("t1")
      .setGroupId("my-group-23asdf46")
      .setStartingOffsets(OffsetsInitializer.latest())
      .setDeserializer(new M1()) // parameter: KafkaRecordDeserializationSchema
      .build()

    val ds1 = env.fromSource(source, WatermarkStrategy.noWatermarks(), "Kafka Source")

    val s1 = ds1
      .map(_.split(","))
      .map(x => C1(x(0), x(1).toInt, x(2).toLong)) // key, number, timestamp
      .assignTimestampsAndWatermarks(new OTAWatermarks(Time.seconds(0)))
      .keyBy(_.f1)
      .window(TumblingEventTimeWindows.of(seconds(10)))
      .trigger(ContinuousProcessingTimeTrigger.of[TimeWindow](seconds(10L)))
      .reduce((x, y) => C1(x.f1, x.f2 + y.f2, 100L))

    s1.print()
    env.execute("KafkaNewSourceAPi")
  }

  // Timestamp/watermark extractor for an out-of-order stream
  class OTAWatermarks(time: Time) extends BoundedOutOfOrdernessTimestampExtractor[C1](time) {
    override def extractTimestamp(element: C1): Long = element.f3
  }

  // key, num, timestamp
  case class C1(f1: String, f2: Int, f3: Long)
}
```

The accompanying Maven pom:

```xml
<project xmlns="http://maven.apache.org/POM/4.0.0"
         xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
         xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>org.example</groupId>
  <artifactId>FlinkLocalDemo</artifactId>
  <version>1.0-SNAPSHOT</version>
  <packaging>jar</packaging>
  <name>FlinkLocalDemo</name>
  <url>http://maven.apache.org</url>

  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
    <flink.version>1.17.1</flink.version>
    <scala.binary.version>2.12</scala.binary.version>
    <scala.version>2.12.8</scala.version>
  </properties>

  <dependencies>
    <!-- https://mvnrepository.com/artifact/joda-time/joda-time -->
    <dependency>
      <groupId>joda-time</groupId>
      <artifactId>joda-time</artifactId>
      <version>2.12.5</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-avro</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-runtime-web</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.alibaba.fastjson2/fastjson2 -->
    <dependency>
      <groupId>com.alibaba.fastjson2</groupId>
      <artifactId>fastjson2</artifactId>
      <version>2.0.33</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/com.alibaba/fastjson -->
    <dependency>
      <groupId>com.alibaba</groupId>
      <artifactId>fastjson</artifactId>
      <version>1.2.83</version>
      <!-- <version>1.2.17</version> -->
    </dependency>
    <dependency>
      <groupId>junit</groupId>
      <artifactId>junit</artifactId>
      <version>3.8.1</version>
      <scope>test</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-table-common -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-common</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-kafka</artifactId>
      <version>${flink.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-json</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-scala_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-streaming-scala_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-csv</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-api-java-bridge</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <!-- The four Table/SQL dependencies below are all required when writing SQL in code -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-api-scala-bridge_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-table-planner-loader -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-planner-loader</artifactId>
      <version>${flink.version}</version>
      <!-- <scope>test</scope> -->
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-table-runtime -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-runtime</artifactId>
      <version>${flink.version}</version>
      <scope>provided</scope>
    </dependency>
    <!-- https://mvnrepository.com/artifact/org.apache.flink/flink-connector-files -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-files</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <!-- Note: flink-table-planner-loader cannot coexist with flink-table-planner_${scala.binary.version} -->
    <!--
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-table-planner_${scala.binary.version}</artifactId>
      <version>${flink.version}</version>
      <scope>provided</scope>
    </dependency>
    -->
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-clients</artifactId>
      <version>${flink.version}</version>
    </dependency>
    <dependency>
      <groupId>org.apache.flink</groupId>
      <artifactId>flink-connector-jdbc</artifactId>
      <version>3.1.0-1.17</version>
      <scope>provided</scope>
    </dependency>
    <dependency>
      <groupId>mysql</groupId>
      <artifactId>mysql-connector-java</artifactId>
      <version>8.0.11</version>
    </dependency>
  </dependencies>

  <build>
    <plugins>
      <!-- shade plugin for building the fat jar -->
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-shade-plugin</artifactId>
        <version>2.4.3</version>
        <executions>
          <execution>
            <phase>package</phase>
            <goals>
              <goal>shade</goal>
            </goals>
            <configuration>
              <filters>
                <filter>
                  <artifact>*:*</artifact>
                  <excludes>
                    <exclude>META-INF/*.SF</exclude>
                    <exclude>META-INF/*.DSA</exclude>
                    <exclude>META-INF/*.RSA</exclude>
                  </excludes>
                </filter>
              </filters>
            </configuration>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>org.scala-tools</groupId>
        <artifactId>maven-scala-plugin</artifactId>
        <version>2.15.2</version>
        <executions>
          <execution>
            <goals>
              <goal>compile</goal>
              <goal>testCompile</goal>
            </goals>
          </execution>
        </executions>
      </plugin>
      <plugin>
        <groupId>net.alchim31.maven</groupId>
        <artifactId>scala-maven-plugin</artifactId>
        <version>3.2.2</version>
        <executions>
          <execution>
            <id>scala-compile-first</id>
            <phase>process-resources</phase>
            <goals>
              <goal>add-source</goal>
              <goal>compile</goal>
            </goals>
          </execution>
        </executions>
        <configuration>
          <scalaVersion>${scala.version}</scalaVersion>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-assembly-plugin</artifactId>
        <version>2.5.5</version>
        <configuration>
          <!-- Optional: uncomment to produce a directly runnable jar
          <archive>
            <manifest>
              <mainClass>${exec.mainClass}</mainClass>
            </manifest>
          </archive>
          -->
          <descriptorRefs>
            <descriptorRef>jar-with-dependencies</descriptorRef>
          </descriptorRefs>
        </configuration>
      </plugin>
      <plugin>
        <groupId>org.apache.maven.plugins</groupId>
        <artifactId>maven-compiler-plugin</artifactId>
        <version>3.1</version>
        <configuration>
          <source>11</source>
          <target>11</target>
        </configuration>
      </plugin>
    </plugins>
  </build>
</project>
```
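The endless output has a simple mechanism: `ContinuousProcessingTimeTrigger` answers each processing-time timer with `FIRE` (not `FIRE_AND_PURGE`) and registers the next timer, while an event-time window is only purged once the watermark passes the window end. With a single record and no further watermark progress, the same pane is re-emitted on every firing. The sketch below is a simplified stand-alone simulation of that interaction, not Flink's actual classes; the `Window` model, method names, and timestamps are illustrative assumptions.

```scala
// Simplified simulation of an event-time window driven by a
// processing-time trigger. Not Flink code; a conceptual model only.
object TriggerLoopSim {
  // Window [start, end) holding one aggregated value; it is purged
  // only when the event-time watermark reaches `end`.
  case class Window(start: Long, end: Long, sum: Int, var purged: Boolean = false)

  // One processing-time timer firing: emit current contents (FIRE without
  // PURGE), leaving the pane in place, as ContinuousProcessingTimeTrigger does.
  def onProcessingTimer(w: Window, out: collection.mutable.Buffer[Int]): Unit =
    if (!w.purged) out += w.sum

  // Event-time cleanup: the pane is dropped once the watermark passes window end.
  def advanceWatermark(w: Window, watermark: Long): Unit =
    if (watermark >= w.end) w.purged = true

  // Run `timerFirings` processing-time firings against a window for the
  // record a,1,1690304400000 (window [..400000, ..410000)).
  def simulate(timerFirings: Int, watermark: Long): Seq[Int] = {
    val w = Window(1690304400000L, 1690304410000L, sum = 1)
    advanceWatermark(w, watermark)
    val out = collection.mutable.Buffer[Int]()
    (1 to timerFirings).foreach(_ => onProcessingTimer(w, out))
    out.toSeq
  }

  def main(args: Array[String]): Unit = {
    // Watermark stalled below the window end (single record, 0s lateness):
    // every processing-time firing re-emits the same pane.
    println(simulate(5, watermark = 1690304399999L).size) // 5
    // Watermark past the window end: the pane is purged, nothing is emitted.
    println(simulate(5, watermark = 1690304410000L).size) // 0
  }
}
```

This is why the job above loops: the watermark from the single record never reaches the window end, so the pane is never purged, and every processing-time firing re-emits it. A consistent alternative in the real job would be `ContinuousEventTimeTrigger`, which keys the periodic firings to the same clock the window assigner uses.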