new File --> file #354

Merged (3 commits) on Jul 3, 2019
CHANGELOG.md (3 additions, 0 deletions)
@@ -26,6 +26,9 @@
* Add proper `nf-core` logo for tools
* Add `Quick Start` section to main README of template
* Fix [Docker RunOptions](https://github.com/nf-core/tools/pull/351) to get UID and GID set in the template
+ * Use [`file`](https://github.com/nf-core/tools/pull/354) instead of `new File`
+   to avoid weird behavior such as making an `s3:/` directory locally when using
+   an AWS S3 bucket as the `--outdir`.

#### Other

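For context on the CHANGELOG entry above, here is a minimal Groovy sketch of the behaviour it describes. The bucket name is invented, and the snippet assumes it runs inside a Nextflow script, where the built-in `file()` helper and the `exists()`/`mkdirs()` extensions it provides on `Path` objects are available.

```groovy
// Hypothetical run, e.g. launched with: --outdir s3://my-bucket/results
// (the bucket name is made up for illustration)
def outdir = 's3://my-bucket/results'

// Before this PR: java.io.File knows nothing about the s3:// scheme, so
// mkdirs() creates a literal local directory tree 's3:/my-bucket/results/pipeline_info'
// under the launch directory instead of touching the bucket.
def bad_dir = new File("${outdir}/pipeline_info/")
bad_dir.mkdirs()

// After this PR: Nextflow's built-in file() returns a java.nio.file.Path bound
// to the matching filesystem provider (local disk, S3, ...), so exists() and
// mkdirs() act on the bucket itself rather than on a local 's3:/' folder.
def good_dir = file("${outdir}/pipeline_info/")
if (!good_dir.exists()) {
    good_dir.mkdirs()
}
```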
nf_core/pipeline-template/{{cookiecutter.name_noslash}}/main.nf (29 additions, 29 deletions)
@@ -49,7 +49,7 @@ def helpMessage() {
* SET UP CONFIGURATION VARIABLES
*/

- // Show help message
+ // Show help emssage
Review comment (suggested change):
- // Show help emssage
+ // Show help message

if (params.help){
helpMessage()
exit 0
@@ -65,7 +65,7 @@ if (params.genomes && params.genome && !params.genomes.containsKey(params.genome
fasta = params.genome ? params.genomes[ params.genome ].fasta ?: false : false
if ( params.fasta ){
fasta = file(params.fasta)
- if ( !fasta.exists() ) exit 1, "Fasta file not found: ${params.fasta}"
+ if( !fasta.exists() ) exit 1, "Fasta file not found: ${params.fasta}"
}
//
// NOTE - THIS IS NOT USED IN THIS PIPELINE, EXAMPLE ONLY
@@ -78,12 +78,12 @@ if ( params.fasta ){
// Has the run name been specified by the user?
// this has the bonus effect of catching both -name and --name
custom_runName = params.name
- if ( !(workflow.runName ==~ /[a-z]+_[a-z]+/) ){
+ if( !(workflow.runName ==~ /[a-z]+_[a-z]+/) ){
custom_runName = workflow.runName
}


- if ( workflow.profile == 'awsbatch') {
+ if( workflow.profile == 'awsbatch') {
// AWSBatch sanity checking
if (!params.awsqueue || !params.awsregion) exit 1, "Specify correct --awsqueue and --awsregion parameters on AWSBatch!"
// Check outdir paths to be S3 buckets if running on AWSBatch
@@ -100,8 +100,8 @@ ch_output_docs = Channel.fromPath("$baseDir/docs/output.md")
/*
* Create a channel for input read files
*/
- if (params.readPaths){
- if (params.singleEnd){
+ if(params.readPaths){
+ if(params.singleEnd){
Channel
.from(params.readPaths)
.map { row -> [ row[0], [file(row[1][0])]] }
@@ -125,28 +125,28 @@ if (params.readPaths){
// Header log info
log.info nfcoreHeader()
def summary = [:]
- if (workflow.revision) summary['Pipeline Release'] = workflow.revision
+ if(workflow.revision) summary['Pipeline Release'] = workflow.revision
summary['Run Name'] = custom_runName ?: workflow.runName
// TODO nf-core: Report custom parameters here
summary['Reads'] = params.reads
summary['Fasta Ref'] = params.fasta
summary['Data Type'] = params.singleEnd ? 'Single-End' : 'Paired-End'
summary['Max Resources'] = "$params.max_memory memory, $params.max_cpus cpus, $params.max_time time per job"
- if (workflow.containerEngine) summary['Container'] = "$workflow.containerEngine - $workflow.container"
+ if(workflow.containerEngine) summary['Container'] = "$workflow.containerEngine - $workflow.container"
summary['Output dir'] = params.outdir
summary['Launch dir'] = workflow.launchDir
summary['Working dir'] = workflow.workDir
summary['Script dir'] = workflow.projectDir
summary['User'] = workflow.userName
- if (workflow.profile == 'awsbatch'){
+ if(workflow.profile == 'awsbatch'){
summary['AWS Region'] = params.awsregion
summary['AWS Queue'] = params.awsqueue
}
summary['Config Profile'] = workflow.profile
- if (params.config_profile_description) summary['Config Description'] = params.config_profile_description
- if (params.config_profile_contact) summary['Config Contact'] = params.config_profile_contact
- if (params.config_profile_url) summary['Config URL'] = params.config_profile_url
- if (params.email) {
+ if(params.config_profile_description) summary['Config Description'] = params.config_profile_description
+ if(params.config_profile_contact) summary['Config Contact'] = params.config_profile_contact
+ if(params.config_profile_url) summary['Config URL'] = params.config_profile_url
+ if(params.email) {
summary['E-mail Address'] = params.email
summary['MultiQC maxsize'] = params.maxMultiqcEmailFileSize
}
@@ -279,7 +279,7 @@ workflow.onComplete {

// Set up the e-mail variables
def subject = "[{{ cookiecutter.name }}] Successful: $workflow.runName"
- if (!workflow.success){
+ if(!workflow.success){
subject = "[{{ cookiecutter.name }}] FAILED: $workflow.runName"
}
def email_fields = [:]
@@ -298,10 +298,10 @@ workflow.onComplete {
email_fields['summary']['Date Completed'] = workflow.complete
email_fields['summary']['Pipeline script file path'] = workflow.scriptFile
email_fields['summary']['Pipeline script hash ID'] = workflow.scriptId
- if (workflow.repository) email_fields['summary']['Pipeline repository Git URL'] = workflow.repository
- if (workflow.commitId) email_fields['summary']['Pipeline repository Git Commit'] = workflow.commitId
- if (workflow.revision) email_fields['summary']['Pipeline Git branch/tag'] = workflow.revision
- if (workflow.container) email_fields['summary']['Docker image'] = workflow.container
+ if(workflow.repository) email_fields['summary']['Pipeline repository Git URL'] = workflow.repository
+ if(workflow.commitId) email_fields['summary']['Pipeline repository Git Commit'] = workflow.commitId
+ if(workflow.revision) email_fields['summary']['Pipeline Git branch/tag'] = workflow.revision
+ if(workflow.container) email_fields['summary']['Docker image'] = workflow.container
email_fields['summary']['Nextflow Version'] = workflow.nextflow.version
email_fields['summary']['Nextflow Build'] = workflow.nextflow.build
email_fields['summary']['Nextflow Compile Timestamp'] = workflow.nextflow.timestamp
@@ -341,7 +341,7 @@ workflow.onComplete {
// Send the HTML e-mail
if (params.email) {
try {
- if ( params.plaintext_email ){ throw GroovyException('Send plaintext e-mail, not HTML') }
+ if( params.plaintext_email ){ throw GroovyException('Send plaintext e-mail, not HTML') }
// Try to send HTML e-mail using sendmail
[ 'sendmail', '-t' ].execute() << sendmail_html
log.info "[{{ cookiecutter.name }}] Sent summary e-mail to $params.email (sendmail)"
@@ -353,27 +353,27 @@
}

// Write summary e-mail HTML to a file
- def output_d = new File( "${params.outdir}/pipeline_info/" )
- if ( !output_d.exists() ) {
+ def output_d = file( "${params.outdir}/pipeline_info/" )
+ if( !output_d.exists() ) {
output_d.mkdirs()
}
- def output_hf = new File( output_d, "pipeline_report.html" )
+ def output_hf = file( output_d, "pipeline_report.html" )
output_hf.withWriter { w -> w << email_html }
- def output_tf = new File( output_d, "pipeline_report.txt" )
+ def output_tf = file( output_d, "pipeline_report.txt" )
output_tf.withWriter { w -> w << email_txt }

c_reset = params.monochrome_logs ? '' : "\033[0m";
c_purple = params.monochrome_logs ? '' : "\033[0;35m";
c_green = params.monochrome_logs ? '' : "\033[0;32m";
c_red = params.monochrome_logs ? '' : "\033[0;31m";

- if (workflow.stats.ignoredCount > 0 && workflow.success) {
+ if (workflow.stats.ignoredCountFmt > 0 && workflow.success) {
log.info "${c_purple}Warning, pipeline completed, but with errored process(es) ${c_reset}"
log.info "${c_red}Number of ignored errored process(es) : ${workflow.stats.ignoredCount} ${c_reset}"
log.info "${c_green}Number of successfully ran process(es) : ${workflow.stats.succeedCount} ${c_reset}"
log.info "${c_red}Number of ignored errored process(es) : ${workflow.stats.ignoredCountFmt} ${c_reset}"
log.info "${c_green}Number of successfully ran process(es) : ${workflow.stats.succeedCountFmt} ${c_reset}"
}

- if (workflow.success){
+ if(workflow.success){
log.info "${c_purple}[{{ cookiecutter.name }}]${c_green} Pipeline completed successfully${c_reset}"
} else {
checkHostname()
@@ -411,11 +411,11 @@ def checkHostname(){
def c_white = params.monochrome_logs ? '' : "\033[0;37m"
def c_red = params.monochrome_logs ? '' : "\033[1;91m"
def c_yellow_bold = params.monochrome_logs ? '' : "\033[1;93m"
- if (params.hostnames){
+ if(params.hostnames){
def hostname = "hostname".execute().text.trim()
params.hostnames.each { prof, hnames ->
hnames.each { hname ->
- if (hostname.contains(hname) && !workflow.profile.contains(prof)){
+ if(hostname.contains(hname) && !workflow.profile.contains(prof)){
log.error "====================================================\n" +
" ${c_red}WARNING!${c_reset} You are running with `-profile $workflow.profile`\n" +
" but your machine hostname is ${c_white}'$hostname'${c_reset}\n" +
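Below is a rough sketch of the report-writing pattern inside `workflow.onComplete` after this change. It is not the template code itself: the variable names are illustrative, `email_html` and `email_txt` are assumed to have been built earlier as in the template, and child paths are joined with `Path.resolve()` rather than the two-argument `file( dir, name )` form used in the diff.

```groovy
workflow.onComplete {
    // file() resolves the path against whatever filesystem --outdir points at
    // (local disk, s3://, ...), so no stray local 's3:/' directory is created
    def report_dir = file("${params.outdir}/pipeline_info")
    if (!report_dir.exists()) {
        report_dir.mkdirs()
    }

    // Path.resolve() stays on the same filesystem, and Groovy's NIO extensions
    // provide withWriter() on Path objects, so this also works for remote outdirs.
    report_dir.resolve('pipeline_report.html').withWriter { w -> w << email_html }
    report_dir.resolve('pipeline_report.txt').withWriter { w -> w << email_txt }
}
```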