
When Tomcat shuts down, it reports that a thread cannot be stopped and warns of a possible memory leak #497

Closed

lzg0409 opened this issue Mar 26, 2018 · 2 comments

@lzg0409
lzg0409 commented Mar 26, 2018

Please answer these questions before submitting your issue. Thanks!
Open source is not easy. We want to focus our effort on delivering new features and solving valuable problems, so to make collaboration more efficient, please answer all of the questions listed below.

Which version of Elastic-Job are you using?

elasticjob-lite 2.1.5

Expected behavior

When Tomcat shuts down, the jobs should be stopped cleanly.

Actual behavior

Tomcat was shut down with the -force option configured:

[root@trweb-ctc-bj-10-254-64-94 bin]# ./shutdown.sh; tail -f ../logs/catalina.out
Using CATALINA_BASE: /data/omsgift-job-8092
Using CATALINA_HOME: /data/omsgift-job-8092
Using CATALINA_TMPDIR: /data/omsgift-job-8092/temp
Using JRE_HOME: /usr
Using CLASSPATH: /data/omsgift-job-8092/bin/bootstrap.jar:/data/omsgift-job-8092/bin/tomcat-juli.jar
Using CATALINA_PID: /data/omsgift-job-8092/CATALINA_PID
Killing Tomcat with the PID: 29702
The Tomcat process has been killed.
Mar 26, 2018 10:59:47 AM org.apache.catalina.loader.WebappClassLoader clearReferencesThreads
SEVERE: The web application [] appears to have started a thread named [statusBindExcetptionJob_QuartzSchedulerThread] but has failed to stop it. This is very likely to create a memory leak.
Mar 26, 2018 10:59:47 AM org.apache.coyote.AbstractProtocol stop
INFO: Stopping ProtocolHandler ["http-bio-8092"]
Mar 26, 2018 10:59:47 AM org.apache.coyote.AbstractProtocol stop
INFO: Stopping ProtocolHandler ["ajp-bio-8101"]
Mar 26, 2018 10:59:47 AM org.apache.coyote.AbstractProtocol destroy
INFO: Destroying ProtocolHandler ["http-bio-8092"]
Mar 26, 2018 10:59:47 AM org.apache.coyote.AbstractProtocol destroy
INFO: Destroying ProtocolHandler ["ajp-bio-8101"]

Tomcat reports that it cannot stop the thread, which is likely to cause a memory leak.

Steps to reproduce the behavior

The warning appears whenever Tomcat's shutdown.sh is executed.

Please provide example code that reproduces the issue (such as a GitHub link); otherwise we will label the issue as invalid and close it.

<job:simple id="statusBindExcetptionJob" class="com.gift.job.service.elastic.StatusBindExcetptionJob"
	sharding-total-count="${statusBindExcetptionJob.shardingTotalCount}"
	registry-center-ref="chunbo-job"
	cron="${statusBindExcetptionJob.cron}" 
	sharding-item-parameters="${statusBindExcetptionJob.shardingItemParameters}"
	monitor-execution="${statusBindExcetptionJob.monitorExecution}"
	monitor-port="${statusBindExcetptionJob.monitorPort}"
	failover="${statusBindExcetptionJob.failover}" 
	description="${statusBindExcetptionJob.description}"
	disabled="${statusBindExcetptionJob.disabled}" 
	overwrite="${statusBindExcetptionJob.overwrite}"
	job-parameter="${statusBindExcetptionJob.jobParameter}"/>			

The corresponding configuration properties:
#7 bindStatus exception send email
statusBindExcetptionJob.cron=0 0 1 * * ? *
statusBindExcetptionJob.shardingTotalCount=2
statusBindExcetptionJob.shardingItemParameters=0=0,1=1
statusBindExcetptionJob.monitorExecution=true
statusBindExcetptionJob.misfire=true
statusBindExcetptionJob.failover=true
statusBindExcetptionJob.overwrite=true
statusBindExcetptionJob.disabled=false
statusBindExcetptionJob.description=bindStatus exception send email
statusBindExcetptionJob.processCountIntervalSeconds=5
statusBindExcetptionJob.concurrentDataProcessThreadCount=1
statusBindExcetptionJob.monitorPort=9887
statusBindExcetptionJob.jobParameter=easyeasy
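
For completeness, a minimal sketch of what the referenced job class might look like, assuming the standard elastic-job-lite 2.x SimpleJob interface; the body below is a placeholder for the reporter's actual implementation, not code taken from this issue:

```java
package com.gift.job.service.elastic;

import com.dangdang.ddframe.job.api.ShardingContext;
import com.dangdang.ddframe.job.api.simple.SimpleJob;

// Hypothetical stand-in for the job referenced in the XML above;
// the real implementation sends an email for bind-status exceptions.
public class StatusBindExcetptionJob implements SimpleJob {

    @Override
    public void execute(final ShardingContext shardingContext) {
        // Each instance processes only the sharding items assigned to it.
        System.out.println("sharding item: " + shardingContext.getShardingItem()
                + ", parameter: " + shardingContext.getShardingParameter());
    }
}
```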

Code should be based on https://github.com/elasticjob/elastic-job-example

@kamping

kamping commented Apr 8, 2018

OP, have you solved this problem?

@baiyunpeng1991

This is caused by the bean not being destroyed correctly, so the GC cannot reclaim the strong reference.
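
One possible container-level workaround (a sketch only, not a confirmed elastic-job API; it assumes the jobs run on Quartz schedulers created through StdSchedulerFactory and therefore visible in the global SchedulerRepository) is to shut those schedulers down explicitly when the webapp is undeployed, so the [jobName]_QuartzSchedulerThread exits before Tomcat checks for leaked threads:

```java
import java.util.Collection;

import javax.servlet.ServletContextEvent;
import javax.servlet.ServletContextListener;
import javax.servlet.annotation.WebListener;

import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.impl.SchedulerRepository;

// Assumption: elastic-job-lite 2.1.5 registers its Quartz schedulers in the
// default SchedulerRepository; if not, this listener simply finds nothing.
@WebListener
public class JobShutdownListener implements ServletContextListener {

    @Override
    public void contextInitialized(final ServletContextEvent sce) {
    }

    @Override
    public void contextDestroyed(final ServletContextEvent sce) {
        Collection<Scheduler> schedulers = SchedulerRepository.getInstance().lookupAll();
        for (Scheduler scheduler : schedulers) {
            try {
                // true = wait for running jobs to finish before the scheduler thread exits.
                scheduler.shutdown(true);
            } catch (SchedulerException ex) {
                sce.getServletContext().log("Failed to shut down scheduler", ex);
            }
        }
    }
}
```

If the container predates Servlet 3.0, register the listener in web.xml instead of relying on @WebListener. Making sure the Spring context is closed on shutdown, so bean destroy callbacks actually run, remains the cleaner fix when it is available.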
