Compile Command

Detailed documentation for the bash-compiler compile command
The `compile` command is a powerful feature of the bash-compiler that allows you to compile your bash scripts into executable binaries. This command takes all your independent bash snippets and transforms them into a single file containing all needed dependencies that can be executed directly on your system.



1. What is it?

1.1. Why?

Before explaining what this tool is, let's begin with why it exists.

Distributing scripts that rely on multiple sourced files is inconvenient, as users must unpack and run them from specific locations. To improve this, the goal is to create a single script file that embeds all necessary files. This approach ensures:

  • Easy distribution as a single file
  • Reliable copy-paste across systems and editors
  • Support for embedding binary (non-printable) files
  • Reusability of bash functions

To achieve this, the script must store other files’ contents in a way that avoids non-printable characters, which can cause issues during copy-paste or transmission over messaging programs.
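The printable-encoding idea can be illustrated with base64, which is one common way to store binary content as plain text (a sketch of the general technique, not necessarily the encoding this compiler uses):

```shell
# Sketch (assumption): embed a binary file as printable text so it survives
# copy-paste, then restore the original bytes at runtime
printf '\000\001\377binary-data' > original.bin   # sample file with non-printable bytes
base64 original.bin > embedded.txt                # printable representation, safe to inline
base64 -d embedded.txt > restored.bin             # restore the exact original bytes
cmp original.bin restored.bin && echo "identical"
```

Because the base64 alphabet contains only printable ASCII characters, the embedded content can be pasted into any editor or messaging program without corruption.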

1.2. What is a bash framework function?

The so-called bash framework functions are the functions defined in this framework that respect the following naming convention:

  • Namespace::Namespace::functionName
    • we can have any number of namespaces
    • each namespace is followed by ::
    • each namespace must begin with an uppercase letter [A-Z] followed by any of the characters [A-Za-z0-9_-]
    • the function name is traditionally written in camelCase with its first letter in lowercase
    • the authorized characters for a function name are [a-zA-Z0-9_-]+
  • the source code of a function following the namespace convention is searched under the srcDirs provided to the compiler via the --src-dir argument
    • each namespace corresponds to a folder
    • the filename of the function is the function name with the .sh extension
    • e.g. the source code of Filters::camel2snakeCase can be found in src/Filters/camel2snakeCase.sh
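For illustration, a minimal hypothetical implementation of Filters::camel2snakeCase stored in src/Filters/camel2snakeCase.sh could look like this (the real framework function may differ):

```shell
# Hypothetical content of src/Filters/camel2snakeCase.sh (illustrative only)
Filters::camel2snakeCase() {
  # insert an underscore before each uppercase letter, then lowercase everything
  sed -E 's/([a-z0-9])([A-Z])/\1_\2/g' | tr '[:upper:]' '[:lower:]'
}

echo "myVariableName" | Filters::camel2snakeCase   # -> my_variable_name
```

Note that bash accepts :: in function names declared with the `name() { … }` syntax, which is what makes this naming convention possible.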

1.3. What does this compiler tool do?

This tool detects all the framework functions used inside a given sh file. A framework function matches the pattern Namespace::functionName (several namespaces can be chained, separated by ::). These framework functions are injected into the compiled file. The process is recursive, so every framework function used by an imported framework function is imported as well (only once, of course).

You can see several examples of compiled files by checking the bash-tools-framework src/_binaries folder.
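The detection step can be approximated with a grep over the naming convention described above (a sketch, not the compiler's actual implementation):

```shell
# Sketch (assumption): list the framework functions referenced by a script,
# matching Namespace::functionName with one or more namespaces
cat > sample.sh <<'EOF'
Log::displayInfo "compiling"
Filters::camel2snakeCase <<<"myVar"
EOF
grep -oE '([A-Z][A-Za-z0-9_-]*::)+[a-z][a-zA-Z0-9_-]*' sample.sh | sort -u
# -> Filters::camel2snakeCase
# -> Log::displayInfo
```

The compiler then repeats this detection on each imported function's source code until no new framework function is found.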

2. The compile command

Description: This command inlines all the functions used in the script given as a parameter

Usage:

bin/compile [-h|--help] prints this help and exits

Usage:

bin/compile <fileToCompile>
            [--src-dir|-s <srcDir>]
            [--bin-dir|-b <binDir>] [--bin-file|-f <binFile>]
            [--root-dir|-r <rootDir>] [--src-path <srcPath>]
            [--template <templateName>] [--keep-temp-files|-k]

Mandatory Arguments:

  • <fileToCompile> the relative or absolute path to compile into one file

Options:

  • --help,-h prints this help and exits

  • --src-dir|-s <srcDir> provides the directory where the functions' source code can be found. By default, this project's src directory is used.

    You can add as many --src-dir options as needed to define other source dirs.

    The functions will be searched in the order defined (this allows function redefinition)

    Example: --src-dir src --src-dir otherSrc

    Functions::myFunction will be searched in

    • src/Functions/myFunction.sh
    • otherSrc/Functions/myFunction.sh

    Important Note: if you provide a --src-dir and you also need the functions defined in this project, remember to add a --src-dir for this project too.

  • --bin-dir|-b <binDir> allows overriding the value of FRAMEWORK_BIN_DIR. By default, FRAMEWORK_BIN_DIR is set to the bin directory located in the parent folder of bin/compile.

  • --bin-file|-f <binFile> the BIN_FILE directive will be overridden by the binFile value. See more information below about directives.

  • --template-dir|-t <templateDir> the template directory used to override some template includes. See more information below about environment variables.

  • --root-dir|-r <rootDir> if you wish to override the FRAMEWORK_ROOT_DIR variable.

    By default, the root directory is the parent folder of bin/compile.

  • --src-path <path> if you wish to override the filepath displayed in the header to indicate the src filepath that has been compiled (SRC_FILE_PATH).

    By default, it is initialized with the path relative to FRAMEWORK_ROOT_DIR.

  • --keep-temp-files|-k keep temporary files for debugging purposes

Examples:

Let’s say you want to generate the binary file bin/shellcheckLint from the source file examples/configReference/shellcheckLint.yaml:

bin/compile "$(pwd)/examples/configReference/shellcheckLint.yaml" --src-dir "$(pwd)/src" \
  --bin-dir "$(pwd)/bin" --root-dir "$(pwd)"

Here you want to generate the binary while overriding some or all functions of vendor/bash-tools-framework/src using the src folder:

bin/compile "$(pwd)/examples/configReference/shellcheckLint.yaml" -s "$(pwd)/src" \
  -s "$(pwd)/vendor/bash-tools-framework/src" --bin-dir "$(pwd)/bin" --root-dir "$(pwd)"

Here you want to override the default templates too:

bin/compile "$(pwd)/examples/configReference/shellcheckLint.yaml" -s "$(pwd)/src" \
  -s "$(pwd)/vendor/bash-tools-framework/src" --bin-dir "$(pwd)/bin" \
  --root-dir "$(pwd)" --template-dir "$(pwd)/src/templates"

3. The compiler - How does this compiler work?

3.1. Template variables

Several variables are automatically generated and can be used in your templates:

  • ORIGINAL_TEMPLATE_DIR allows you to include templates relative to the script being interpreted
  • TEMPLATE_DIR the template directory in which you can override the templates defined in ORIGINAL_TEMPLATE_DIR

The following variables depend upon the parameters passed to this script:

  • SRC_FILE_PATH the src file path shown at the top of the generated file to indicate from which source file the binary has been generated.
  • SRC_ABSOLUTE_PATH the path of the file being compiled; it can be useful if you need to access a path relative to this file during compilation.

3.2. Compiler Algorithm

The command to generate a bash binary file:

./bin/bash-compiler examples/configReference/shellcheckLint.yaml \
  --root-dir /home/wsl/fchastanet/bash-dev-env/vendor/bash-tools-framework \
  --target-dir examples/generated \
  --keep-intermediate-files

This will trigger the following actions:

Activity diagram explaining how the compile command works
Source code: Activity diagram
@startuml "compiler"
title compiler algorithm
skinparam {
  ' https://github.com/plantuml/plantuml/blob/49115dfc7d4156961e5b49a81c09b474daa79823/src/net/sourceforge/plantuml/style/FromSkinparamToStyle.java#L145
  activityDiamondBackgroundColor #AAAAAA
  activityEndColor #red
}

start

:compile binaryModelYamlFile;

#SpringGreen:model.LoadBinaryModel binaryModelYamlFile;

if (ok) then (binaryModel)
else (error)
  stop
endif

partition "compiler.GenerateCode" {

  #SpringGreen:compiler.GenerateCode binaryModel;

  :loadGoTemplate;

  :renderGoTemplate;

  note right
    using binaryModel data
    only commands part for the moment
  endnote

  if (ok) then (code)
  else (error)
    stop
  endif
}

partition "compiler.Compile" {

  :compiler.extractUniqueFrameworkFunctions &functionsMap, code;

  partition "compiler.retrieveEachFunctionPath" #LightSkyBlue {
    :compiler.retrieveEachFunctionPath &functionsMap, binaryModel.BinFile.SrcDirs;
    note right
      for each function retrieve the full src file path
      also try to get the _.sh and ZZZ.sh files if they exist
    endnote
    repeat :each function of functionsMap;
      if (function) then (not already retrieved)
        :compile.findFunctionInSrcDirs function, binaryModel.BinFile.SrcDirs;
        if (file not found) then (error)
          stop
        endif
        :register function src file;
        :register _.sh if exists;
        :register ZZZ.sh if exists;
      endif
    repeat while (more function?)
  }

  partition "compiler.retrieveAllFunctionsContent" #LightSkyBlue {
    :compiler.retrieveAllFunctionsContent &functionsMap;
    note right
      for each function retrieve content of each src file path
    endnote
  }


  partition "Compiler::Require::requires" #LightSkyBlue {

    #SpringGreen:Compiler::Require::filter scriptFile;
    while (requireDirective?) is (<color:green>require directive found)
      -[#green]->
      #SpringGreen:Compiler::Require::parse $requireDirective
        ~~uses Compiler::Require::assertInterface~~
      ;
      if (implement directive can't be parsed or function does not exist?) is (<color:red>invalid directive) then
        -[#red]->
        end
      else
        -[#green]->
      endif
      -[#green]->
      :Compiler::Require::parse;
    endwhile (no more require\ndirective to process)
  }

  :import functions from the two previous tasks
  and inject them before the # FUNCTIONS token;
}

-[#green,dashed]-> compiler
process
continues
;

partition "Compiler::Require::requires" #LightSkyBlue {
  note right
  **second phase**
  call again Compiler::Require::requires
  to get final list of requires in reverse order
  to add the calls to those require functions
  just after the token ~~# REQUIRES~~
  endnote
  :File::insertFileAfterToken requires "# REQUIRES";
}

partition "compilerEnds" #pink {
  :Compiler::Implement::mergeInterfacesFunctions;
  :Compiler::Implement::validateInterfaceFunctions;
  :Output result;
}

end
@enduml
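The compile.findFunctionInSrcDirs step in the diagram above can be sketched in bash as a first-match lookup over the configured source directories (an illustrative sketch, not the actual Go implementation; the function and parameter names are assumptions):

```shell
# Sketch (assumption): resolve Namespace::functionName to its source file,
# searching each src dir in order; the first match wins (function redefinition)
findFunctionInSrcDirs() {
  local functionName="$1"; shift
  # Filters::camel2snakeCase -> Filters/camel2snakeCase.sh
  local relPath="${functionName//:://}.sh"
  local srcDir
  for srcDir in "$@"; do
    if [ -f "${srcDir}/${relPath}" ]; then
      echo "${srcDir}/${relPath}"
      return 0
    fi
  done
  return 1  # function not found in any src dir
}

mkdir -p src/Filters && touch src/Filters/camel2snakeCase.sh
findFunctionInSrcDirs Filters::camel2snakeCase otherSrc src
# -> src/Filters/camel2snakeCase.sh
```

Because directories are scanned in the order given, placing your own src directory before the framework's allows you to redefine framework functions, as shown in the examples of section 2.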

3.3. Class diagram

bash-compiler class diagram
Source code: bash-compiler class diagram
@startuml
legend
<u><b>Legend</b></u>
Render Aggregations: true
Render Compositions: true
Render Implementations: true
Render Connections: false
Render Fields: true
Render Methods: true
Private Aggregations: true
end legend
namespace compiler {
    interface AnnotationProcessorInterface  {
        + Init(compileContextData *CompileContextData) error
        + ParseFunction(compileContextData *CompileContextData, functionStruct *functionInfoStruct) error
        + Process(compileContextData *CompileContextData) error
        + PostProcess(compileContextData *CompileContextData, code string) (string, error)

    }
    class CompileContext << (S,Aquamarine) >> {
        + Init(templateContextData *render.TemplateContextData, config *model.CompilerConfig) (*CompileContextData, error)
        + Compile(compileContextData *CompileContextData, code string) (string, error)

    }
    class CompileContextData << (S,Aquamarine) >> {
        + Validate() error

    }
    class annotation << (S,Aquamarine) >> {
    }
    class annotationCastError << (S,Aquamarine) >> {
        + FunctionName string

        + Error() string

    }
    class annotationEmbedGenerate << (S,Aquamarine) >> {
        + RenderResource(asName string, resource string, lineNumber int) (string, error)

    }
    interface annotationEmbedGenerateInterface  {
        + RenderResource(asName string, resource string, lineNumber int) (string, error)

    }
    class annotationProcessor << (S,Aquamarine) >> {
    }
    class compiler.InsertPosition << (T, #FF7700) >>  {
    }
    class duplicatedAsNameError << (S,Aquamarine) >> {
        + Error() string

    }
    class duplicatedFunctionsDirectiveError << (S,Aquamarine) >> {
        + LineNumber int

        + Error() string

    }
    class embedAnnotationProcessor << (S,Aquamarine) >> {
        + Init(compileContextData *CompileContextData) error
        + ParseFunction(_ *CompileContextData, _ *functionInfoStruct) error
        + Process(_ *CompileContextData) error
        + PostProcess(_ *CompileContextData, code string) (string, error)

    }
    class functionInfoStruct << (S,Aquamarine) >> {
        + FunctionName string
        + SrcFile string
        + SourceCode string
        + AnnotationMap <font color=blue>map</font>[string]<font color=blue>interface</font>{}
        + Inserted bool
        + InsertPosition InsertPosition
        + SourceCodeLoaded bool
        + SourceCodeAsTemplate bool

    }
    class functionNotFoundError << (S,Aquamarine) >> {
        + FunctionName string
        + SrcDirs []string

        + Error() string

    }
    class requireAnnotation << (S,Aquamarine) >> {
    }
    class requireAnnotationProcessor << (S,Aquamarine) >> {
        + Init(compileContextData *CompileContextData) error
        + ParseFunction(compileContextData *CompileContextData, functionStruct *functionInfoStruct) error
        + Process(compileContextData *CompileContextData) error
        + PostProcess(_ *CompileContextData, code string) (string, error)

    }
    class requiredFunctionNotFoundError << (S,Aquamarine) >> {
        + Error() string

    }
    class unsupportedEmbeddedResourceError << (S,Aquamarine) >> {
        + Error() string

    }
}
"__builtin__.error" *-- "extends""compiler.annotationCastError"
"__builtin__.error" *-- "extends""compiler.duplicatedAsNameError"
"__builtin__.error" *-- "extends""compiler.duplicatedFunctionsDirectiveError"
"compiler.annotationProcessor" *-- "extends""compiler.embedAnnotationProcessor"
"__builtin__.error" *-- "extends""compiler.functionNotFoundError"
"compiler.annotation" *-- "extends""compiler.requireAnnotation"
"compiler.annotationProcessor" *-- "extends""compiler.requireAnnotationProcessor"
"__builtin__.error" *-- "extends""compiler.requiredFunctionNotFoundError"
"__builtin__.error" *-- "extends""compiler.unsupportedEmbeddedResourceError"

"services.CodeCompilerInterface" <|-- "implements""compiler.CompileContext"
"compiler.annotationEmbedGenerateInterface" <|-- "implements""compiler.annotationEmbedGenerate"
"compiler.AnnotationProcessorInterface" <|-- "implements""compiler.embedAnnotationProcessor"
"compiler.AnnotationProcessorInterface" <|-- "implements""compiler.requireAnnotationProcessor"

"compiler.CompileContext""uses" o-- "compiler.AnnotationProcessorInterface"
"compiler.CompileContext""uses" o-- "render.TemplateContextInterface"
"compiler.CompileContextData""uses" o-- "compiler.CompileContext"
"compiler.CompileContextData""uses" o-- "compiler.functionInfoStruct"
"compiler.CompileContextData""uses" o-- "model.CompilerConfig"
"compiler.CompileContextData""uses" o-- "regexp.Regexp"
"compiler.CompileContextData""uses" o-- "render.TemplateContextData"
"compiler.annotationEmbedGenerate""uses" o-- "render.TemplateContextData"
"compiler.embedAnnotationProcessor""uses" o-- "compiler.annotationEmbedGenerateInterface"
"compiler.functionInfoStruct""uses" o-- "compiler.InsertPosition"
"compiler.requireAnnotationProcessor""uses" o-- "compiler.CompileContextData"

namespace errors {
    class ValidationError << (S,Aquamarine) >> {
        + InnerError error
        + Context string
        + FieldName string
        + FieldValue any

        + Error() string

    }
}


"errors.ValidationError""uses" o-- "errors.any"

namespace files {
    class directoryPathMissingError << (S,Aquamarine) >> {
        + DirPath string

        + Error() string

    }
    class directoryWasExpectedError << (S,Aquamarine) >> {
        + Directory string

        + Error() string

    }
    class filePathMissingError << (S,Aquamarine) >> {
        + FilePath string

        + Error() string

    }
    class fileWasExpectedError << (S,Aquamarine) >> {
        + File string

        + Error() string

    }
}
"__builtin__.error" *-- "extends""files.directoryPathMissingError"
"__builtin__.error" *-- "extends""files.directoryWasExpectedError"
"__builtin__.error" *-- "extends""files.filePathMissingError"
"__builtin__.error" *-- "extends""files.fileWasExpectedError"



namespace main {
    class Directory << (S,Aquamarine) >> {
        + Validate() error

    }
    class VersionFlag << (S,Aquamarine) >> {
        + Decode(_ *kong.DecodeContext) error
        + IsBool() bool
        + BeforeApply(app *kong.Kong, vars kong.Vars) error

    }
    class YamlFiles << (S,Aquamarine) >> {
        + Validate() error

    }
    class cli << (S,Aquamarine) >> {
        + YamlFiles YamlFiles
        + TargetDir Directory
        + Version VersionFlag
        + KeepIntermediateFiles bool
        + Debug bool
        + LogLevel int
        + CompilerRootDir Directory

    }
    class getCurrentFilenameError << (S,Aquamarine) >> {
        + Error() string

    }
    class main.Directory << (T, #FF7700) >>  {
    }
    class main.VersionFlag << (T, #FF7700) >>  {
    }
    class main.YamlFiles << (T, #FF7700) >>  {
    }
}
"__builtin__.error" *-- "extends""main.getCurrentFilenameError"


"main.cli""uses" o-- "main.Directory"
"main.cli""uses" o-- "main.VersionFlag"
"main.cli""uses" o-- "main.YamlFiles"

namespace model {
    class BinaryModel << (S,Aquamarine) >> {
        + CompilerConfig CompilerConfig
        + Vars structures.Dictionary
        + BinData <font color=blue>interface</font>{}

    }
    class BinaryModelLoader << (S,Aquamarine) >> {
        + Load(targetDir string, binaryModelFilePath string, binaryModelBaseName string, referenceDir string, keepIntermediateFiles bool) (*BinaryModel, error)

    }
    class CompilerConfig << (S,Aquamarine) >> {
        + AnnotationsConfig structures.Dictionary
        + TargetFile string
        + RelativeRootDirBasedOnTargetDir string
        + CommandDefinitionFiles []string
        + TemplateFile string
        + TemplateDirs []string
        + FunctionsIgnoreRegexpList []string
        + SrcDirs []string
        + SrcDirsExpanded []string

    }
}

"services.BinaryModelLoaderInterface" <|-- "implements""model.BinaryModelLoader"

"model.BinaryModel""uses" o-- "model.CompilerConfig"
"model.BinaryModel""uses" o-- "structures.Dictionary"
"model.CompilerConfig""uses" o-- "structures.Dictionary"

namespace render {
    class TemplateContext << (S,Aquamarine) >> {
        + Init(templateDirs []string, templateFile string, data <font color=blue>interface</font>{}, funcMap <font color=blue>map</font>[string]<font color=blue>interface</font>{}) (*TemplateContextData, error)
        + Render(templateContextData *TemplateContextData, templateName string) (string, error)
        + RenderFromTemplateName(templateContextData *TemplateContextData) (string, error)
        + RenderFromTemplateContent(templateContextData *TemplateContextData, templateContent string) (string, error)

    }
    class TemplateContextData << (S,Aquamarine) >> {
        + TemplateContext TemplateContextInterface
        + TemplateName *string
        + Template templateInterface
        + RootData <font color=blue>interface</font>{}
        + Data <font color=blue>interface</font>{}

    }
    interface TemplateContextInterface  {
        + Render(templateContextData *TemplateContextData, templateName string) (string, error)
        + RenderFromTemplateContent(templateContextData *TemplateContextData, templateContent string) (string, error)

    }
    class fileNotFoundError << (S,Aquamarine) >> {
        + File string
        + SrcDirs []string

        + Error() string

    }
    class notSupportedTypeError << (S,Aquamarine) >> {
        + ObjectType string

        + Error() string

    }
    interface templateInterface  {
        + ExecuteTemplate(wr io.Writer, name string, data any) error
        + Parse(text string) (*template.Template, error)

    }
}
"__builtin__.error" *-- "extends""render.fileNotFoundError"
"__builtin__.error" *-- "extends""render.notSupportedTypeError"

"render.TemplateContextInterface" <|-- "implements""render.TemplateContext"
"services.TemplateContextInterface" <|-- "implements""render.TemplateContext"

"render.TemplateContextData""uses" o-- "render.TemplateContextInterface"
"render.TemplateContextData""uses" o-- "render.templateInterface"

namespace services {
    interface BinaryModelLoaderInterface  {
        + Load(targetDir string, binaryModelFilePath string, binaryModelBaseName string, referenceDir string, keepIntermediateFiles bool) (*model.BinaryModel, error)

    }
    class BinaryModelServiceContext << (S,Aquamarine) >> {
        + Init(targetDir string, keepIntermediateFiles bool, binaryModelFilePath string) (*BinaryModelServiceContextData, error)
        + Compile(binaryModelServiceContextData *BinaryModelServiceContextData) error

    }
    class BinaryModelServiceContextData << (S,Aquamarine) >> {
    }
    interface CodeCompilerInterface  {
        + Init(templateContextData *render.TemplateContextData, config *model.CompilerConfig) (*compiler.CompileContextData, error)
        + Compile(compileContextData *compiler.CompileContextData, code string) (string, error)

    }
    interface TemplateContextInterface  {
        + Init(templateDirs []string, templateFile string, data <font color=blue>interface</font>{}, funcMap <font color=blue>map</font>[string]<font color=blue>interface</font>{}) (*render.TemplateContextData, error)
        + Render(templateContextData *render.TemplateContextData, templateName string) (string, error)
        + RenderFromTemplateName(templateContextData *render.TemplateContextData) (string, error)
        + RenderFromTemplateContent(templateContextData *render.TemplateContextData, templateContent string) (string, error)

    }
}


"services.BinaryModelServiceContext""uses" o-- "services.BinaryModelLoaderInterface"
"services.BinaryModelServiceContext""uses" o-- "services.CodeCompilerInterface"
"services.BinaryModelServiceContext""uses" o-- "services.TemplateContextInterface"
"services.BinaryModelServiceContextData""uses" o-- "compiler.CompileContextData"
"services.BinaryModelServiceContextData""uses" o-- "model.BinaryModel"
"services.BinaryModelServiceContextData""uses" o-- "render.TemplateContextData"

namespace structures {
    class Dictionary << (S,Aquamarine) >> {
        + GetStringValue(key string) (string, error)
        + GetStringList(key string) ([]string, error)

    }
    class invalidValueTypeError << (S,Aquamarine) >> {
        + Value any

        + Error() string

    }
    class missingKeyError << (S,Aquamarine) >> {
        + Key string

        + Error() string

    }
    class structures.Dictionary << (T, #FF7700) >>  {
    }
}
"__builtin__.error" *-- "extends""structures.invalidValueTypeError"
"__builtin__.error" *-- "extends""structures.missingKeyError"


"structures.invalidValueTypeError""uses" o-- "structures.any"

@enduml
bash-compiler class diagram with private methods
Source code: bash-compiler class diagram with private methods
@startuml
legend
<u><b>Legend</b></u>
Render Aggregations: true
Render Fields: true
Render Methods: true
Private Aggregations: true
end legend
namespace compiler {
    interface AnnotationProcessorInterface  {
        + Init(compileContextData *CompileContextData) error
        + ParseFunction(compileContextData *CompileContextData, functionStruct *functionInfoStruct) error
        + Process(compileContextData *CompileContextData) error
        + PostProcess(compileContextData *CompileContextData, code string) (string, error)

    }
    class CompileContext << (S,Aquamarine) >> {
        - templateContext render.TemplateContextInterface
        - annotationProcessors []AnnotationProcessorInterface

        - generateCode(compileContextData *CompileContextData, code string) (bool, string, error)
        - functionsAnalysis(compileContextData *CompileContextData, code string) error
        - renderEachFunctionAsTemplate(compileContextData *CompileContextData) error
        - isNonFrameworkFunction(compileContextData *CompileContextData, functionName string) bool
        - nonFrameworkFunctionRegexpCompile(compileContextData *CompileContextData)
        - generateFunctionCode(compileContextData *CompileContextData) (string, error)
        - insertFunctionsCode(compileContextData *CompileContextData, functionNames []string, buffer *bytes.Buffer, insertPosition InsertPosition) error
        - retrieveAllFunctionsContent(compileContextData *CompileContextData) (bool, error)
        - retrieveEachFunctionPath(compileContextData *CompileContextData) (bool, error)
        - extractUniqueFrameworkFunctions(compileContextData *CompileContextData, code string) bool
        - findFileInSrcDirs(compileContextData *CompileContextData, relativeFilePath string) (string, bool)

        + Init(templateContextData *render.TemplateContextData, config *model.CompilerConfig) (*CompileContextData, error)
        + Compile(compileContextData *CompileContextData, code string) (string, error)

    }
    class CompileContextData << (S,Aquamarine) >> {
        - compileContext *CompileContext
        - templateContextData *render.TemplateContextData
        - config *model.CompilerConfig
        - functionsMap <font color=blue>map</font>[string]functionInfoStruct
        - ignoreFunctionsRegexp []*regexp.Regexp

        + Validate() error

    }
    class annotation << (S,Aquamarine) >> {
    }
    class annotationCastError << (S,Aquamarine) >> {
        + FunctionName string

        + Error() string

    }
    class annotationEmbedGenerate << (S,Aquamarine) >> {
        - embedDirTemplateName string
        - embedFileTemplateName string
        - templateContextData *render.TemplateContextData

        - renderFile(asName string, resource string, fileMode os.FileMode) (string, error)
        - renderDir(asName string, resource string) (string, error)
        - renderTemplate(data <font color=blue>map</font>[string]string, templateName string) (string, error)

        + RenderResource(asName string, resource string, lineNumber int) (string, error)

    }
    interface annotationEmbedGenerateInterface  {
        + RenderResource(asName string, resource string, lineNumber int) (string, error)

    }
    class annotationProcessor << (S,Aquamarine) >> {
    }
    class compiler.InsertPosition << (T, #FF7700) >>  {
    }
    class duplicatedAsNameError << (S,Aquamarine) >> {
        - lineNumber int
        - asName string
        - resource string

        + Error() string

    }
    class duplicatedFunctionsDirectiveError << (S,Aquamarine) >> {
        + LineNumber int

        + Error() string

    }
    class embedAnnotationProcessor << (S,Aquamarine) >> {
        - annotationEmbedGenerate annotationEmbedGenerateInterface
        - embedMap <font color=blue>map</font>[string]string

        + Init(compileContextData *CompileContextData) error
        + ParseFunction(_ *CompileContextData, _ *functionInfoStruct) error
        + Process(_ *CompileContextData) error
        + PostProcess(_ *CompileContextData, code string) (string, error)

    }
    class functionInfoStruct << (S,Aquamarine) >> {
        + FunctionName string
        + SrcFile string
        + SourceCode string
        + AnnotationMap <font color=blue>map</font>[string]<font color=blue>interface</font>{}
        + Inserted bool
        + InsertPosition InsertPosition
        + SourceCodeLoaded bool
        + SourceCodeAsTemplate bool

        - getRequireAnnotation() (*requireAnnotation, error)

    }
    class functionNotFoundError << (S,Aquamarine) >> {
        + FunctionName string
        + SrcDirs []string

        + Error() string

    }
    class requireAnnotation << (S,Aquamarine) >> {
        - requiredFunctions []string
        - isRequired bool
        - checkRequirementsCodeAdded bool
        - codeAddedOnRequiredFunctions bool

    }
    class requireAnnotationProcessor << (S,Aquamarine) >> {
        - compileContextData *CompileContextData
        - checkRequirementsTemplateName string
        - requireTemplateName string

        - addRequireCodeToEachRequiredFunctions(compileContextData *CompileContextData, functionStruct *functionInfoStruct) error
        - addRequireCode(compileContextData *CompileContextData, functionStruct *functionInfoStruct) error

        + Init(compileContextData *CompileContextData) error
        + ParseFunction(compileContextData *CompileContextData, functionStruct *functionInfoStruct) error
        + Process(compileContextData *CompileContextData) error
        + PostProcess(_ *CompileContextData, code string) (string, error)

    }
    class requiredFunctionNotFoundError << (S,Aquamarine) >> {
        - functionName string

        + Error() string

    }
    class unsupportedEmbeddedResourceError << (S,Aquamarine) >> {
        - asName string
        - resource string
        - lineNumber int

        + Error() string

    }
}
"__builtin__.error" *-- "extends""compiler.annotationCastError"
"__builtin__.error" *-- "extends""compiler.duplicatedAsNameError"
"__builtin__.error" *-- "extends""compiler.duplicatedFunctionsDirectiveError"
"compiler.annotationProcessor" *-- "extends""compiler.embedAnnotationProcessor"
"__builtin__.error" *-- "extends""compiler.functionNotFoundError"
"compiler.annotation" *-- "extends""compiler.requireAnnotation"
"compiler.annotationProcessor" *-- "extends""compiler.requireAnnotationProcessor"
"__builtin__.error" *-- "extends""compiler.requiredFunctionNotFoundError"
"__builtin__.error" *-- "extends""compiler.unsupportedEmbeddedResourceError"

"services.CodeCompilerInterface" <|-- "implements""compiler.CompileContext"
"compiler.annotationEmbedGenerateInterface" <|-- "implements""compiler.annotationEmbedGenerate"
"compiler.AnnotationProcessorInterface" <|-- "implements""compiler.embedAnnotationProcessor"
"compiler.AnnotationProcessorInterface" <|-- "implements""compiler.requireAnnotationProcessor"

"compiler.CompileContext""uses" o-- "compiler.AnnotationProcessorInterface"
"compiler.CompileContext""uses" o-- "render.TemplateContextInterface"
"compiler.CompileContextData""uses" o-- "compiler.CompileContext"
"compiler.CompileContextData""uses" o-- "compiler.functionInfoStruct"
"compiler.CompileContextData""uses" o-- "model.CompilerConfig"
"compiler.CompileContextData""uses" o-- "regexp.Regexp"
"compiler.CompileContextData""uses" o-- "render.TemplateContextData"
"compiler.annotationEmbedGenerate""uses" o-- "render.TemplateContextData"
"compiler.embedAnnotationProcessor""uses" o-- "compiler.annotationEmbedGenerateInterface"
"compiler.functionInfoStruct""uses" o-- "compiler.InsertPosition"
"compiler.requireAnnotationProcessor""uses" o-- "compiler.CompileContextData"

namespace errors {
    class ValidationError << (S,Aquamarine) >> {
        + InnerError error
        + Context string
        + FieldName string
        + FieldValue any

        + Error() string

    }
}


"errors.ValidationError""uses" o-- "errors.any"

namespace files {
    class directoryPathMissingError << (S,Aquamarine) >> {
        + DirPath string

        + Error() string

    }
    class directoryWasExpectedError << (S,Aquamarine) >> {
        + Directory string

        + Error() string

    }
    class filePathMissingError << (S,Aquamarine) >> {
        + FilePath string

        + Error() string

    }
    class fileWasExpectedError << (S,Aquamarine) >> {
        + File string

        + Error() string

    }
}
"__builtin__.error" *-- "extends""files.directoryPathMissingError"
"__builtin__.error" *-- "extends""files.directoryWasExpectedError"
"__builtin__.error" *-- "extends""files.filePathMissingError"
"__builtin__.error" *-- "extends""files.fileWasExpectedError"



namespace main {
    class Directory << (S,Aquamarine) >> {
        + Validate() error

    }
    class VersionFlag << (S,Aquamarine) >> {
        + Decode(_ *kong.DecodeContext) error
        + IsBool() bool
        + BeforeApply(app *kong.Kong, vars kong.Vars) error

    }
    class YamlFiles << (S,Aquamarine) >> {
        + Validate() error

    }
    class cli << (S,Aquamarine) >> {
        + YamlFiles YamlFiles
        + TargetDir Directory
        + Version VersionFlag
        + KeepIntermediateFiles bool
        + Debug bool
        + LogLevel int
        + CompilerRootDir Directory

    }
    class getCurrentFilenameError << (S,Aquamarine) >> {
        + Error() string

    }
    class main.Directory << (T, #FF7700) >>  {
    }
    class main.VersionFlag << (T, #FF7700) >>  {
    }
    class main.YamlFiles << (T, #FF7700) >>  {
    }
}
"__builtin__.error" *-- "extends""main.getCurrentFilenameError"


"main.cli""uses" o-- "main.Directory"
"main.cli""uses" o-- "main.VersionFlag"
"main.cli""uses" o-- "main.YamlFiles"

namespace model {
    class BinaryModel << (S,Aquamarine) >> {
        + CompilerConfig CompilerConfig
        + Vars structures.Dictionary
        + BinData <font color=blue>interface</font>{}

    }
    class BinaryModelLoader << (S,Aquamarine) >> {
        - setEnvVars(binaryModel *BinaryModel)
        - expandVars(binaryModel *BinaryModel)

        + Load(targetDir string, binaryModelFilePath string, binaryModelBaseName string, referenceDir string, keepIntermediateFiles bool) (*BinaryModel, error)

    }
    class CompilerConfig << (S,Aquamarine) >> {
        + AnnotationsConfig structures.Dictionary
        + TargetFile string
        + RelativeRootDirBasedOnTargetDir string
        + CommandDefinitionFiles []string
        + TemplateFile string
        + TemplateDirs []string
        + FunctionsIgnoreRegexpList []string
        + SrcDirs []string
        + SrcDirsExpanded []string

    }
}

"services.BinaryModelLoaderInterface" <|-- "implements""model.BinaryModelLoader"

"model.BinaryModel""uses" o-- "model.CompilerConfig"
"model.BinaryModel""uses" o-- "structures.Dictionary"
"model.CompilerConfig""uses" o-- "structures.Dictionary"

namespace render {
    class TemplateContext << (S,Aquamarine) >> {
        + Init(templateDirs []string, templateFile string, data <font color=blue>interface</font>{}, funcMap <font color=blue>map</font>[string]<font color=blue>interface</font>{}) (*TemplateContextData, error)
        + Render(templateContextData *TemplateContextData, templateName string) (string, error)
        + RenderFromTemplateName(templateContextData *TemplateContextData) (string, error)
        + RenderFromTemplateContent(templateContextData *TemplateContextData, templateContent string) (string, error)

    }
    class TemplateContextData << (S,Aquamarine) >> {
        + TemplateContext TemplateContextInterface
        + TemplateName *string
        + Template templateInterface
        + RootData <font color=blue>interface</font>{}
        + Data <font color=blue>interface</font>{}

    }
    interface TemplateContextInterface  {
        + Render(templateContextData *TemplateContextData, templateName string) (string, error)
        + RenderFromTemplateContent(templateContextData *TemplateContextData, templateContent string) (string, error)

    }
    class fileNotFoundError << (S,Aquamarine) >> {
        + File string
        + SrcDirs []string

        + Error() string

    }
    class notSupportedTypeError << (S,Aquamarine) >> {
        + ObjectType string

        + Error() string

    }
    interface templateInterface  {
        + ExecuteTemplate(wr io.Writer, name string, data any) error
        + Parse(text string) (*template.Template, error)

    }
}
"__builtin__.error" *-- "extends""render.fileNotFoundError"
"__builtin__.error" *-- "extends""render.notSupportedTypeError"

"render.TemplateContextInterface" <|-- "implements""render.TemplateContext"
"services.TemplateContextInterface" <|-- "implements""render.TemplateContext"

"render.TemplateContextData""uses" o-- "render.TemplateContextInterface"
"render.TemplateContextData""uses" o-- "render.templateInterface"

namespace services {
    interface BinaryModelLoaderInterface  {
        + Load(targetDir string, binaryModelFilePath string, binaryModelBaseName string, referenceDir string, keepIntermediateFiles bool) (*model.BinaryModel, error)

    }
    class BinaryModelServiceContext << (S,Aquamarine) >> {
        - binaryModelLoader BinaryModelLoaderInterface
        - templateContext TemplateContextInterface
        - codeCompiler CodeCompilerInterface

        - renderBinaryCodeFromTemplate(binaryModelServiceContextData *BinaryModelServiceContextData) (string, error)
        - renderCode(binaryModelServiceContextData *BinaryModelServiceContextData) (string, error)

        + Init(targetDir string, keepIntermediateFiles bool, binaryModelFilePath string) (*BinaryModelServiceContextData, error)
        + Compile(binaryModelServiceContextData *BinaryModelServiceContextData) error

    }
    class BinaryModelServiceContextData << (S,Aquamarine) >> {
        - binaryModelData *model.BinaryModel
        - compileContextData *compiler.CompileContextData
        - templateContextData *render.TemplateContextData
        - targetDir string
        - keepIntermediateFiles bool
        - binaryModelFilePath string
        - binaryModelBaseName string

    }
    interface CodeCompilerInterface  {
        + Init(templateContextData *render.TemplateContextData, config *model.CompilerConfig) (*compiler.CompileContextData, error)
        + Compile(compileContextData *compiler.CompileContextData, code string) (string, error)

    }
    interface TemplateContextInterface  {
        + Init(templateDirs []string, templateFile string, data <font color=blue>interface</font>{}, funcMap <font color=blue>map</font>[string]<font color=blue>interface</font>{}) (*render.TemplateContextData, error)
        + Render(templateContextData *render.TemplateContextData, templateName string) (string, error)
        + RenderFromTemplateName(templateContextData *render.TemplateContextData) (string, error)
        + RenderFromTemplateContent(templateContextData *render.TemplateContextData, templateContent string) (string, error)

    }
}


"services.BinaryModelServiceContext""uses" o-- "services.BinaryModelLoaderInterface"
"services.BinaryModelServiceContext""uses" o-- "services.CodeCompilerInterface"
"services.BinaryModelServiceContext""uses" o-- "services.TemplateContextInterface"
"services.BinaryModelServiceContextData""uses" o-- "compiler.CompileContextData"
"services.BinaryModelServiceContextData""uses" o-- "model.BinaryModel"
"services.BinaryModelServiceContextData""uses" o-- "render.TemplateContextData"

namespace structures {
    class Dictionary << (S,Aquamarine) >> {
        + GetStringValue(key string) (string, error)
        + GetStringList(key string) ([]string, error)

    }
    class invalidValueTypeError << (S,Aquamarine) >> {
        + Value any

        + Error() string

    }
    class missingKeyError << (S,Aquamarine) >> {
        + Key string

        + Error() string

    }
    class structures.Dictionary << (T, #FF7700) >>  {
    }
}
"__builtin__.error" *-- "extends""structures.invalidValueTypeError"
"__builtin__.error" *-- "extends""structures.missingKeyError"


"structures.invalidValueTypeError""uses" o-- "structures.any"

"__builtin__.[]string" #.. "alias of""main.YamlFiles"
"__builtin__.int8" #.. "alias of""compiler.InsertPosition"
"__builtin__.string" #.. "alias of""main.Directory"
"__builtin__.string" #.. "alias of""main.VersionFlag"
"structures.<font color=blue>map</font>[string]<font color=blue>interface</font>{}" #.. "alias of""structures.Dictionary"
@enduml

3.4. Dependency diagram

bash-compiler dependency diagram
Source code: bash-compiler dependency diagram
@startuml

'!pragma layout elk

package main {
  [Main] ..> ParseCliInterface
  [Main] ..> BinaryModelServiceInterface
  ParseCliInterface )-- [ParseCli]  : <<implements>>
}

package services {
  BinaryModelServiceInterface )-- [BinaryModelService] : <<implements>>
  [BinaryModelService] ..> BinaryModelInterface : <<uses>>
  [BinaryModelService] ..> TemplateContextInterface : <<uses>>
  [BinaryModelService] ..> CompilerInterface : <<uses>>
}

package model {
  BinaryModelInterface )-- [BinaryModel] : <<implements>>
}

package render {
  TemplateContextInterface )-- [TemplateContext] : <<implements>>
}

package compiler {
  CompilerInterface )-- [Compiler] : <<implements>>
}

@enduml

3.5. Directives and templates

You can use special optional directives in the src file:

  • # FUNCTIONS mandatory directive
  • @embed directive
  • @require directive

The compile command allows you to generate a binary file using directives placed directly inside the src file.

Eg:

#!/usr/bin/env bash
# BIN_FILE=${FRAMEWORK_ROOT_DIR}/bin/binaryExample
# @embed "Backup::file" as backupFile
# @embed "${FRAMEWORK_ROOT_DIR}/bin/otherNeededBinary" AS "otherNeededBinary"
# FACADE

sudo "${embed_file_backupFile}" # ...
"${embed_file_otherNeededBinary}"
# ...

The above file header generates the bin/binaryExample binary file. It uses the @embed directive to make the Backup::file function usable as a binary named backupFile, which can even be called using sudo.

In the previous example, the directive # FUNCTIONS is injected via the file cmd/bash-compiler/defaultTemplates/binFile.gtpl.

The src file should contain at least the BIN_FILE directive at the top of the bash script file (see example above).

3.5.1. FUNCTIONS directive

This is the most important directive: it tells the compiler where the dependent framework functions will be injected in your resulting bash file.

3.5.2. Compiler - Compiler::Requirement::require

The compiler, during successive passes:

  • uses existing compiler passes (injectImportedFunctions)
    • parses the # @require directives of each newly injected function
      • error if the require name does not begin with require
      • error if the require name does not comply with the naming convention
      • error if the require* file is not found
    • ignores the disabled requirements
    • computes a tree of require dependencies
    • gradually injects the framework functions linked to the required functions
  • at the end of compiler processing
    • injects the requirement calls in the order specified by the dependency tree (see below).
3.5.2.1. Requires dependencies tree

The following rules apply:

  • Some requirements can depend on each other; the compiler computes which dependency should be loaded before the other. Eg: the Log::requireLoad requirement depends on Framework::requireRootDir, so Framework::requireRootDir is loaded first. The Log requirement also depends on the Env::requireLoad requirement.
  • A requirement can be set at namespace level, by adding the directive in the _.sh file, or at function level.
  • A requirement can be loaded only once.
  • A requirement used by several functions gets a higher priority and is loaded before less prioritized requirements.
  • The # FUNCTIONS placeholder should be defined before the # REQUIREMENTS placeholder
  • The # REQUIREMENTS placeholder should be defined before the # ENTRYPOINT placeholder
3.5.2.2. Require example

The @require annotation added to a function, as in this example:

# @require Env::requireLoad
# @require Log::requireLoad
Log::logMessage() {
  # rest of the function content
}

will trigger the following actions:

  • the compiler checks that the required functions exist; if not, an error is triggered.
  • the compiler adds code to the required function that sets an environment variable to 1 when the function is called (eg: REQUIRE_FUNCTION_ENV_REQUIRE_LOAD_LOADED=1).
  • the compiler adds code to the function that has these requirements, in order to check whether these environment variables are set, exiting with code 1 if not.
  • the compiler checks that the function is called at least once, but it is the developer’s responsibility to call the require function at the right place.

Code is generated using Go templates. The Go templates are configured in the yaml file at compiler config level.

compilerConfig:
  annotationsConfig:
    requireTemplate: require
    checkRequirementsTemplate: checkRequirements
# rest of the config file content

examples/templates/annotations/require.gtpl => generates this code:

Env::requireLoad() {
  REQUIRE_FUNCTION_ENV_REQUIRE_LOAD_LOADED=1
  # rest of the function content
}

examples/templates/annotations/checkRequirements.gtpl => generates this code:

# @require Env::requireLoad
# @require Log::requireLoad
Log::logMessage() {
  if [[ "${REQUIRE_FUNCTION_ENV_REQUIRE_LOAD_LOADED:-0}" != 1 ]]; then
    echo >&2 "Requirement Env::requireLoad has not been loaded"
    exit 1
  fi

  if [[ "${REQUIRE_FUNCTION_LOG_REQUIRE_LOAD_LOADED:-0}" != 1 ]]; then
    echo >&2 "Requirement Log::requireLoad has not been loaded"
    exit 1
  fi
  # rest of the function content
}

The aims of a require are the following:

  • to be able to test for a requirement just before executing a function that is marked with @require
  • when compiling, to be able to know if a function with a specific requirement has been used (eg: ubuntu>20)
  • There are several kinds of requirements:
    • checking that a command is available
      • this requirement needs to be called at the proper level if the binary actually installs this command.
      • @require Aws::requireAwsCommand
      • @require Docker::requireDockerCommand
      • @require Git::requireGitCommand
      • @require Linux::requireCurlCommand
      • @require Linux::requireJqCommand
      • @require Linux::requireRealpathCommand
      • @require Linux::requirePathchkCommand
      • @require Linux::requireSudoCommand
      • @require Linux::requireTarCommand
      • @require Ssh::requireSshKeygenCommand
      • @require Ssh::requireSshKeyscanCommand
    • checking a feature is available
      • @require Git::requireShallowClone, actually based on the git version
    • checking a specific environment/state is available on execution
      • @require Linux::requireUbuntu
      • @require Linux::Wsl::requireWsl
      • @require Linux::requireExecutedAsUser
      • ubuntu>20
    • ensuring some specific loading are made
      • @require Env::requireLoad
      • @require Log::requireLoad
      • @require UI::requireTheme
3.5.2.3. Requires dependencies use cases

Script file example:

# FUNCTIONS placeholder
# REQUIRES placeholder
Linux::Apt::update || Log::displayError "impossible to update"
  • first compiler injectImportedFunctions pass
    • Linux::Apt::update requires
      • Linux::requireSudoCommand
      • Linux::requireUbuntu
    • Log::display* requires Colors::requireTheme
  • second compiler injectImportedFunctions pass
    • Log::log* requires Log::requireLoad
  • third compiler injectImportedFunctions pass
    • Log::requireLoad requires Env::requireLoad
  • fourth compiler injectImportedFunctions pass
    • Env::requireLoad requires
      • Framework::requireRootDir
      • Framework::tmpFileManagement (see src/_includes/_commonHeader.sh)
  • fifth compiler injectImportedFunctions pass
    • Framework::tmpFileManagement requires
      • Framework::requireRootDir which is already in the required list

If we order the requirements following reversed pass order, we end up with:

  • Framework::tmpFileManagement
  • Framework::requireRootDir
    • here we have an issue as it should come before Framework::tmpFileManagement
    • a solution could be to add the element to require list even if it is already in the list. This way it could even give a weight at certain requires.
  • Env::requireLoad
  • Log::requireLoad
  • Colors::requireTheme
  • Linux::requireUbuntu
  • Linux::requireSudoCommand

To take into consideration:

  • at each pass, we will parse the full list of functions and requires
    • it means the array of requires has to be reset at each pass.

Let’s walk through our example again, pass by pass (some functions, like Retry::default needed by Linux::Apt::update, were intentionally left out to make the example easier to understand).

Pass #1: import functions Linux::Apt::update and Log::displayError

# @require Linux::requireSudoCommand
# @require Linux::requireUbuntu
Linux::Apt::update() { :; }
# @require Log::requireLoad
Log::displayError() {
  #...
  Log::logMessage #...
}
# FUNCTIONS placeholder
# we don't have any yet as we are still parsing the 3 lines
# code above.
# REQUIRES placeholder
Linux::Apt::update || Log::displayError "impossible to update"

Functions imported list so far:

  • Linux::Apt::update
  • Log::displayError

Pass #2: import the function Log::logMessage and import the required functions in reverse order: Linux::requireSudoCommand, Linux::requireUbuntu, Log::requireLoad. Note: remember that require functions are only filtered using # @require

# @require Linux::requireSudoCommand
# @require Linux::requireUbuntu
Linux::Apt::update() { :; }
# @require Log::requireLoad
Log::displayError() {
  #...
  Log::logMessage #...
}
Log::logMessage() { :; }
Linux::requireSudoCommand() { :; }
Linux::requireUbuntu() { :; }
# @require Env::requireLoad
Log::requireLoad() { :; }
# FUNCTIONS placeholder

Log::requireLoad
Linux::requireUbuntu
Linux::requireSudoCommand
# REQUIRES placeholder
Linux::Apt::update || Log::displayError "impossible to update"

Functions imported list so far:

  • Linux::Apt::update
  • Log::displayError
  • Log::logMessage
  • Log::requireLoad
  • Linux::requireSudoCommand
  • Linux::requireUbuntu

Pass #3: import functions; importing the required functions will import Env::requireLoad, so the order of requires will be:

Env::requireLoad
Log::requireLoad
Linux::requireUbuntu
Linux::requireSudoCommand

3.5.3. @embed directive (optional)

Allows embedding files, directories, or a framework function. The following syntax can be used:

Syntax: # @embed "srcFile" AS "targetFile"

Syntax: # @embed "srcDir" AS "targetDir"

If the @embed annotation is provided, the given file/dir will be added inside the resulting bin file as a tar gz file (base64 encoded) and automatically extracted when executed.

The @embed annotation allows embedding a file or a directory as a base64 encoded string. annotationEmbed allows to:

  • include a file (binary or not) as a base64 encoded string; the file can then be extracted using the automatically generated method Compiler::Embed::extractFile_asName, where asName is the name chosen using the annotation explained above. The original file mode is restored after extraction. The variable embed_file_asName contains the targeted filepath.
  • include a directory; the directory is tar-gzipped and added to the compiled file as a base64 encoded string. The directory can then be extracted using the automatically generated method Compiler::Embed::extractDir_asName, where asName is the name chosen using the annotation explained above. The variable embed_dir_asName contains the targeted directory path.
  • include a bash framework function; a special binary file that simply calls this function is automatically generated. This binary file is added to the compiled file as a base64 encoded string, then automatically extracted to a temporary directory and callable directly using the asName chosen above, because the path of the temporary directory has been added to the PATH variable.

The syntax is the following:

# @embed "${FRAMEWORK_ROOT_DIR}/README.md" as readme
# @embed "${FRAMEWORK_ROOT_DIR}/.cspell" as cspell

This will generate the code below:

Compiler::Embed::extractFileFromBase64 \
  "${PERSISTENT_TMPDIR:-/tmp}/1e26600f34bdaf348803250aa87f4924/readme" \
  "base64 encode string" \
  "644"

declare -gx embed_file_readme="${PERSISTENT_TMPDIR:-/tmp}/1e26600f34bdaf348803250aa87f4924/readme"

Compiler::Embed::extractDirFromBase64 \
  "${PERSISTENT_TMPDIR:-/tmp}/5c12a039d61ab2c98111e5353362f380/cspell" \
  "base64 encode string"

declare -gx embed_dir_cspell="${PERSISTENT_TMPDIR:-/tmp}/5c12a039d61ab2c98111e5353362f380/cspell"

The embedded files will be automatically uncompressed.

activity diagram to explain how EMBED directives are injected
Source code: Activity diagram
@startuml "compilerEmbedInjection"
title compiler embed injection
skinparam {
  ' https://github.com/plantuml/plantuml/blob/49115dfc7d4156961e5b49a81c09b474daa79823/src/net/sourceforge/plantuml/style/FromSkinparamToStyle.java#L145
  activityDiamondBackgroundColor #AAAAAA
  activityEndColor #red
}
start

:compiler;

#SpringGreen:Compiler::Embed::inject fromFile;

#SpringGreen:Compiler::Embed::filter fromFile;
while (embedDirective?) is (<color:green>embed found)
  -[#green]->
  partition Compiler::Embed::embed #SpringGreen {
    #SpringGreen:Compiler::Embed::parse $embedDirective
      ~~uses Compiler::Embed::assertAsName~~
      ~~uses Compiler::Embed::assertResource~~
    ;
    if (embed directive can be parsed?) is (<color:green>valid directive) then
      -[#green]->
    else (<color:red>invalid embed\n<color:red>directive)
      -[#red]->
      end
    endif
    -[#green]->
  }

  if (embed name already injected?) then (yes)
    :display skip;
  else (<color:green>embed resource can be injected)
    if (embed resources already injected?) then (yes)
      :display warning;
    endif
    partition Compiler::Embed::embed #LightGray {
      #SpringGreen:__Compiler::Embed::embed__ **$resource** **$asName**;
      switch (resource type?)
      case ( resource is a file )
        #SpringGreen:Compiler::Embed::embedFile;
      case ( resource is a directory )
        #SpringGreen:Compiler::Embed::embedDir;
      case ( resource is a bash\nframework function )
        #SpringGreen:Compiler::Embed::embedFrameworkFunction;
        #SpringGreen:bin/compile;
      case ( <color:red>error )
        -[#red]->
        end
      endswitch
    }
    if (embed resource injection failure ?) then
      -[#red]-> <color:red>display error\n<color:red>and return 2;
      end
    else
      -[#green]->
    endif

  endif
  -[#green]-> <color:green>process next embed directive;

endwhile (no more embed directive to process)

:Output result;

stop
@enduml

See compiler - Compiler::Embed::embed below for more information.

4. Best practices

The @embed keyword is really useful to inline configuration files. However, to run a framework function using sudo, it is recommended to call the same binary with options that change its behavior. This way the content of the script file does not look obfuscated.

5. Acknowledgements

Many thanks to Michał Zieliński (Tratif company) for this wonderful article, which helped me a lot in the conception of the file/dir/framework function embedding feature.

For more information, see Bash Tips #6 – Embedding Files In A Single Bash Script

Backup page of the above article

2 - Technical architecture

Technical architecture of bash-compiler

1. Go Libraries used

2. Template system

template system doc 1

There is a choice between the Go text/template and html/template libraries. I chose text/template to avoid escaping that is not needed in bash.

Neither text/template nor html/template provides any execution context to the filters (FuncMap).

I’m not using Template.ParseGlob because I would have to call it twice, with 2 glob patterns, to include the files of the root directory and of the sub directories. But a bug in text/template makes the template be initialized again after each call to the ParseGlob function. So I manually compute the list of templates in the NewTemplate function of internal/render/render.go.

I simulated a context by pushing the context to the render function. So the data associated with the template has the following structure:

type Context struct {
	Template *template.Template
	Name     string
	RootData any
	Data     any
}
  • Template points to the first template that has been rendered
  • Name is the name of the first template that has been rendered
  • RootData are the data that have been sent at the start of the rendering
  • Data are the data sent to the sub template (possibly a part of RootData or the whole RootData)

Then each filter has to be called with the right context. The special include filter allows including a sub template, overriding the context Data.

Template filter functions, internal/render/functions/index.go includes:

  • Sprig filter functions
    • Sprig is not maintained anymore; a possible alternative fork is sprout, but it is missing a lot of functions.
  • my own template functions
    • string functions
      • stringLength
      • format allows formatting a string, like in this example
        • {{ format "${%sLongDescription[@]}" .functionName }}
    • template functions
      • include: allows including a template by template name, allowing the use of filters on its output
      • includeFile: allows including a template by filename
      • includeFileAsTemplate: same as includeFile but interprets the file as a template
      • dynamicFile: resolves the first matching filepath in the paths provided as argument

3. Compiler

see Compile command.

3 - Development

Guidelines for developing and contributing to bash-compiler

1. Requirements

1.1. Go Version

This project requires Go 1.25.7 or later.

To check your current Go version:

go version

To install or upgrade Go, visit golang.org/dl.

1.2. Pre-commit Hook

This repository uses the pre-commit tool to ensure that every commit respects a set of rules specified by the .pre-commit-config.yaml file. It requires pre-commit to be installed.

Enable pre-commit hooks with:

pre-commit install --hook-type pre-commit --hook-type pre-push

Now linters and compilation tools will run automatically on commit and push.

1.3. Pre-commit dependencies

You need to install some dependencies for the pre-commit hooks to work properly. You can do it with:

.github/scripts/install-dev.sh

2. Build/Run/Clean

Formatting is managed exclusively by pre-commit hooks.

2.1. Build

Build with Docker:

.github/scripts/build-docker.sh

Build locally (requires Go 1.25.7):

.github/scripts/build-local.sh

2.2. Tests

Run all tests with race detector:

.github/scripts/test.sh

Run tests for specific package:

go test -v -race ./internal/compiler/...

2.3. Coverage

Generate coverage report:

.github/scripts/coverage.sh

Coverage reports are generated in logs/coverage.log.

2.4. Run the Binary

.github/scripts/run.sh

2.5. Clean

.github/scripts/clean.sh

3. Dependencies Management

3.1. Updating Go Dependencies

The project uses Go modules for dependency management. To update dependencies:

3.1.1. Update All Dependencies

go get -u ./...

Then tidy and verify:

go mod tidy
go mod verify

3.1.2. Update Specific Package

go get -u github.com/example/package

3.1.3. Downgrade a Package

go get github.com/example/package@v1.0.0

3.1.4. Upgrading Go Version

  1. Check current version:

    go version
    
  2. Update go.mod:

    go get -u golang.org/x/net golang.org/x/crypto golang.org/x/sys golang.org/x/text
    go get -u ./...
    
  3. Update go.mod with new version:

    # Edit go.mod manually or use:
    grep -n "^go " go.mod
    # Then update the version number and toolchain directive
    

    Example: changing from 1.24 to 1.25.7:

    module github.com/fchastanet/bash-compiler
    
    go 1.25.7
    
    toolchain go1.25.7
    
  4. Tidy and verify:

    go mod tidy
    go mod verify
    
  5. Run tests:

    go test ./... -race
    
  6. Build:

    go build ./cmd/bash-compiler
    

4. Manual Compilation Commands

4.1. Compile Binary with Config

go run ./cmd/bash-compiler examples/configReference/shellcheckLint.yaml \
  --root-dir /path/to/bash-tools-framework \
  -t examples/generated -k -d

Flags:

  • -t, --intermediate-files-dir: Output directory for generated files
  • -k: Keep intermediate files
  • -d, --debug: Enable debug logging

4.2. Transform and Validate YAML with CUE

cue export \
  -l input: examples/generated/shellcheckLint-merged.yaml \
  internal/model/binFile.cue --out yaml \
  -e output >examples/generated/shellcheckLint-cue-transformed.yaml

5. KCL Configuration Language

The project uses KCL for configuration validation. See KCL Documentation.

5.1. Test KCL Files

cd internal/model/kcl
kcl -D configFile=testsKcl/bad-example.yaml

6. Project Structure

Key directories:

  • .github/scripts/ - Build and test scripts
  • cmd/bash-compiler/ - Main entry point and CLI
  • internal/compiler/ - Core compilation logic
  • internal/model/ - Data structures and YAML models
  • internal/render/ - Template rendering engine
  • internal/utils/ - Utility packages
  • examples/configReference/ - Reference YAML configurations
  • content/docs/ - Documentation files

7. Common Workflows

7.1. Adding a New Dependency

go get github.com/example/package
go mod tidy
go test ./...

7.2. Running Specific Tests

# Table-driven test specific case
go test -v -run TestFunctionName/caseName ./internal/package

# All tests in package with coverage
go test -v -cover ./internal/package

7.3. Debugging

Enable debug logging in the compiler:

bash-compiler yaml-file -d

Check intermediate files:

bash-compiler yaml-file -t /tmp/debug -k
ls -la /tmp/debug/

7.4. Code Style

  • Indentation: Tabs for Go files (enforced by .editorconfig)
  • Formatting: Handled by pre-commit hooks (gofmt)
  • Linting: Multiple linters via MegaLinter

To run linters manually:

pre-commit run --all-files

8. Testing

8.1. Test Organization

Tests use:

  • github.com/stretchr/testify for assertions
  • gotest.tools/v3 for advanced utilities
  • Table-driven test pattern (standard)

Example test pattern:

func TestFunction(t *testing.T) {
    tests := []struct {
        name    string
        input   string
        want    string
        wantErr bool
    }{
        {name: "case 1", input: "x", want: "y"},
        {name: "case 2", input: "foo", wantErr: true},
    }
    for _, tt := range tests {
        t.Run(tt.name, func(t *testing.T) {
            got, err := Function(tt.input)
            if tt.wantErr {
                assert.Error(t, err)
                return
            }
            assert.NoError(t, err)
            assert.Equal(t, tt.want, got)
        })
    }
}

8.2. Test Coverage

Minimum coverage recommendations:

  • Warning: 60% of statements
  • Good: 80% or higher

View coverage:

go test ./... -cover

Generate detailed report:

go test ./... -coverprofile=coverage.out
go tool cover -html=coverage.out

9. CI/CD

GitHub Actions workflows:

  • main.yml - Build Docker images, run MegaLinter, and tests
  • Runs on push to master, pull requests, and manual dispatch
  • Excludes changes to docs/**

Check workflow logs in: .github/workflows/main.yml

10. Troubleshooting

10.1. Build Fails

# Check Go version
go version

# Clean and rebuild
.github/scripts/clean.sh
.github/scripts/build-local.sh

# Check module cache
go clean -modcache
go mod tidy

10.2. Tests Fail

# Run with verbose output
go test -v -race ./...

# Run specific test
go test -v -run TestName ./package

10.3. Pre-commit Hooks Fail

# Run hooks manually
pre-commit run --all-files

# Run specific hook
pre-commit run hook-id --all-files

Some hooks auto-fix - stage changes and retry.

10.4. Module Issues

# Verify module integrity
go mod verify

# Download and check all modules
go mod download

11. Release Checklist

Before releasing:

  1. ✅ All tests pass: go test ./... -race
  2. ✅ Coverage acceptable: .github/scripts/coverage.sh
  3. ✅ Pre-commit passes: pre-commit run --all-files
  4. ✅ Documentation updated
  5. ✅ Dependencies audit: go mod verify
  6. ✅ Build successful: go build ./cmd/bash-compiler
  7. ✅ Commit message follows guidelines (see commit-msg-template.md)

12. Additional Resources

4 - Backup Pages

Archive of external reference material preserved for long-term access

This section contains backup archives of external reference material. These pages are preserved to ensure continued access to valuable information if the original sources become unavailable.

All backup pages are protected from search engine indexing to respect original content creators’ intellectual property rights.

1. Backup Pages Currently Archived

The following pages have been archived:

Articles in this section

Title | Description | Updated
Bash Tips 6 – Embedding Files in a Single Bash Script | Techniques for embedding files into a single bash script using base64 encoding and process substitution | 2026-02-27

2. About Backup Pages

Backup pages are archived copies of valuable technical content from external sources. They serve as a reference library in case the original website becomes unavailable or moves.

2.1. Why Backup Pages?

  • Preservation: Valuable content doesn’t disappear
  • Attribution: Original authors are properly credited
  • Access: Works even if original source goes down
  • Reference: Build a comprehensive knowledge base
  • Fair Use: Protected through proper archival practices

2.2. How It Works

Each backup page:

  1. Clearly indicates it’s a backup with a prominent notice
  2. 🔒 Blocks search engine indexing via robots: noindex meta tag
  3. 🔗 Links to the original source for proper attribution
  4. 👤 Credits the original author with a direct link when available
  5. 📅 Shows the original publication date and backup date

Backup pages use fair use for archival purposes:

  • Original author is credited with links
  • Search engines are blocked from indexing
  • Original source links are prominent
  • Content is preserved without modification
  • Purpose is archival, not competition

3. Search Engine Protection

All backup pages include:

<meta content="noindex, noarchive, nocache, nofollow, nosnippet, notranslate, noimageindex" name="robots"/>

This ensures:

  • 🛑 Search engines won’t index the page
  • 🛑 Internet Archive won’t create snapshots
  • ✅ Respects original content creators’ rights
  • ✅ Maintains fair use for archival purposes

4. Backup Notice

Every backup page displays a prominent notice showing:

📦 This is a Backup Copy

This page is a backup archive of external reference material.
It is preserved to ensure continued access to valuable information
in case the original source becomes unavailable.

Original Source: [link to original]
Original Author: [author with link if available]
Original Date: [publication date]
Backed Up On: [backup date]

⚠️ Search engines are blocked from indexing this page
   to respect original content creators' rights.

5. FAQ

Q: Is this legal? A: Yes, when done properly with attribution and search engine blocking, archival falls under fair use.

Q: Will my backup appear in Google? A: No, the noindex meta tag prevents search engine indexing.

Q: Do I need to ask permission? A: Fair use principles and proper attribution usually cover this, but check the original license.

Q: Can I modify the content? A: Preserve original content as-is. Note modifications clearly if needed.

Q: What if the original disappears? A: That’s the purpose of backup pages—to preserve valuable information.

Q: How often should I update backups? A: An archive doesn’t change. Update only if the original changes significantly and you want to mirror the update.

4.1 - Bash Tips 6 – Embedding Files in a Single Bash Script

Techniques for embedding files into a single bash script using base64 encoding and process substitution

Scripts that utilize multiple files are not easy to distribute. We usually distribute those as archives and rely on the end user to unpack and run them from a predetermined location. To improve the experience we can instead prepare a single script with other files embedded inside it.

Here are the goals:

  1. The script should consist of a single file, making it easy to distribute.
  2. The script should be copy-paste-able between systems and different editors, even if multiple hops are required.
  3. Files being embedded can be binary files, i.e. can contain non-printable characters.

The first requirement implies that we should somehow store the contents of other files in our main script. The second requires us to avoid non-printable characters, as they tend to cause problems when performing a copy-paste operation, especially when sending them over messaging programs.

Encoding

The solution to the second and third problems is a binary-to-text encoding which encodes an array of bytes into a text constant of printable characters. And the most commonly used encoding scheme is base64. Utils to encode to and from base64 are included in most Linux distributions out-of-the-box.

Let’s transform a file, logging setup script, into base64 encoded text:

base64 -w 0 includes/logging.sh

prints

TE9HRklMRT0iJHsxOi1zY3JpcHQubG9nfSIKZXhlYyAzPiYxIDE+IiRMT0dGSUxFIiAyPiYxCnRyYXAgImVjaG8gJ0VSUk9SOiBBbiBlcnJvciBvY2N1cnJlZCBkdXJpbmcgZXhlY3V0aW9uLCBjaGVjayBsb2cgJExPR0ZJTEUgZm9yIGRldGFpbHMuJyA+JjMiIEVSUgp0cmFwICd7IHNldCAreDsgfSAyPi9kZXYvbnVsbDsgZWNobyAtbiAiWyQoZGF0ZSAtSXMpXSAgIjsgc2V0IC14JyBERUJVRwo=

which, when decoded

base64 -d <<<TE9HRklMRT0iJHsxOi1zY3JpcHQubG9nfSIKZXhlYyAzPiYxIDE+IiRMT0dGSUxFIiAyPiYxCnRyYXAgImVjaG8gJ0VSUk9SOiBBbiBlcnJvciBvY2N1cnJlZCBkdXJpbmcgZXhlY3V0aW9uLCBjaGVjayBsb2cgJExPR0ZJTEUgZm9yIGRldGFpbHMuJyA+JjMiIEVSUgp0cmFwICd7IHNldCAreDsgfSAyPi9kZXYvbnVsbDsgZWNobyAtbiAiWyQoZGF0ZSAtSXMpXSAgIjsgc2V0IC14JyBERUJVRwo=

prints the contents of the file:

LOGFILE="${1:-script.log}"
exec 3>&1 1>"$LOGFILE" 2>&1
trap "echo 'ERROR: An error occurred during execution, check log $LOGFILE for details.' >&3" ERR
trap '{ set +x; } 2>/dev/null; echo -n "[$(date -Is)]  "; set -x' DEBUG

this can be redirected to a file, that can later be used:

base64 -d >/tmp/logging.sh <<<TE9HRklMRT0iJHsxOi1zY3JpcHQubG9nfSIKZXhlYyAzPiYxIDE+IiRMT0dGSUxFIiAyPiYxCnRyYXAgImVjaG8gJ0VSUk9SOiBBbiBlcnJvciBvY2N1cnJlZCBkdXJpbmcgZXhlY3V0aW9uLCBjaGVjayBsb2cgJExPR0ZJTEUgZm9yIGRldGFpbHMuJyA+JjMiIEVSUgp0cmFwICd7IHNldCAreDsgfSAyPi9kZXYvbnVsbDsgZWNobyAtbiAiWyQoZGF0ZSAtSXMpXSAgIjsgc2V0IC14JyBERUJVRwo=

Using base64 allows us to store binary files as an easy to work with text. I have used exactly this mechanism to prepare a script that would store and activate binary licenses for an external proprietary system that we have used in one of our projects.

It is worth noting that in the case of shell scripts, base64 encoding provides a safety layer, protecting us from accidental execution or expansion. If we were to use a here-document to achieve the same functionality, we would have to account for variable expansion:

cat >/tmp/logging.sh <<EOC
LOGFILE="${1:-script.log}"
exec 3>&1 1>"$LOGFILE" 2>&1
trap "echo 'ERROR: An error occurred during execution, check log $LOGFILE for details.' >&3" ERR
trap '{ set +x; } 2>/dev/null; echo -n "[$(date -Is)]  "; set -x' DEBUG
EOC

This code does not work: the resulting file has variables expanded, as all variables inside here-documents are expanded:

(contents of /tmp/logging.sh file created by running the command above)

LOGFILE="script.log"
exec 3>&1 1>"" 2>&1
trap "echo 'ERROR: An error occurred during execution, check log  for details.' >&3" ERR
trap '{ set +x; } 2>/dev/null; echo -n "[2023-01-24T13:36:23+01:00]  "; set -x' DEBUG
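For completeness: bash can also write a here-document verbatim by quoting the delimiter (<<'EOC'), which disables all expansion, though base64 remains the safer choice for binary content. A minimal sketch (the file name is illustrative):

```shell
# Sketch: the quoted delimiter 'EOC' turns off parameter and command
# expansion inside the here-document, so the text is written as-is.
cat >/tmp/verbatim.sh <<'EOC'
echo "arg is ${1:-default}"
EOC
bash /tmp/verbatim.sh          # prints: arg is default
bash /tmp/verbatim.sh hello    # prints: arg is hello
```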

Utilizing process substitution

We do not have to create a temporary file that we later have to clean up. If a file is only to be read once, we can utilize process substitution. It allows referencing an output (or input) of a process as a file that can be accessed. Let’s see that in an example:

base64 -d <<<SGVsbG8gV29ybGQhCg==

this command prints ‘Hello World!’ on standard output, nothing special.

<(base64 -d <<<SGVsbG8gV29ybGQhCg==)

this construct expands to the path of a file that, when read, yields the output of the command inside <(...).

echo <(base64 -d <<<SGVsbG8gV29ybGQhCg==)

prints

/proc/self/fd/11

a file path. This file is created by process substitution. If we read it, we get the output of our base64 command, i.e. the decoded message. Let’s replace echo with cat to see the file contents:

cat <(base64 -d <<<SGVsbG8gV29ybGQhCg==)

<(base64 -d <<<SGVsbG8gV29ybGQhCg==) gets transformed into a file path, and cat performs read on this file and prints its contents on the console:

Hello World!

Quite a sophisticated way to print a simple message. With the mechanism explained, let’s proceed with utilizing it in a script:

source <(base64 -d <<<TE9HRklMRT0iJHsxOi1zY3JpcHQubG9nfSIKZXhlYyAzPiYxIDE+IiRMT0dGSUxFIiAyPiYxCnRyYXAgImVjaG8gJ0VSUk9SOiBBbiBlcnJvciBvY2N1cnJlZCBkdXJpbmcgZXhlY3V0aW9uLCBjaGVjayBsb2cgJExPR0ZJTEUgZm9yIGRldGFpbHMuJyA+JjMiIEVSUgp0cmFwICd7IHNldCAreDsgfSAyPi9kZXYvbnVsbDsgZWNobyAtbiAiWyQoZGF0ZSAtSXMpXSAgIjsgc2V0IC14JyBERUJVRwo=)

This is a base64 encoded logging script from the previous section, which we can easily source without creating a temporary file.
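The same pattern works for any snippet. A minimal sketch that embeds a one-function library and sources it on the fly (the function name and payload are illustrative, not part of the framework):

```shell
# Sketch: encode a tiny function library, then source it via process
# substitution; no temporary file is ever created.
payload=$(printf 'greet() { echo "hello from the embedded lib"; }\n' | base64 -w 0)
source <(base64 -d <<<"$payload")
greet    # prints: hello from the embedded lib
```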

Embedding an entire directory tree

With the tools presented in the previous sections at our disposal, let’s try to create a single portable script from a complex multi-file setup. We will use the code presented in a previous article of mine on templating.

First, let’s change our working directory to the one that contains the main script script-3.1.sh and create a compressed archive of the contents of the entire project using tar and gzip, which we then base64 encode:

cd ... # path to the project

# .
# ├── includes
# │   ├── gatheringFacts.sh
# │   ├── logging.sh
# │   └── templating.sh
# ├── script-3.1.sh
# ├── templates
# │   └── config.yml
# ├── utils
# │   └── getIp.sh

tar -cz -O . | base64 -w 0

Here is our encoded archive:

H4sIAAAAAAAAA+1Z+2/iRhDmZ0v5H+Zc98C64iePiJRc6R25IuWBOKhUJShy7AVWMba1a0LSHP97Z21eIdfSVoGoV39IRp79dmbx7IxnFk3P7RwGoloui2+zWjbWvxfImSW7YlfLlmGZOcPEayUH5d0vLZeb8NhhALkbEofBX/C2jf9HoemTmPp8p7vg7/u/VClbNvrftKtG5v99YOH/IYlbkcZHu7Cxzf8V217EfwV9j/63qlUrB8YuFrOJ/7n/v3uj39BAv3H4SJpwZ0gKKjxKgCDuKAS5zWgQcygMKOOxCq02OJ7HCOfgcB661ImJB1MajyAg8TRkt4B8wgaOS+COOjAdUXeUypAAMXMGA+pCHAKN4FBLPkA5sHCCmjRZmknSUkNdKSArGQLcoEv+FxgyEkGxHUK+8P7HukfuQL2aviu8r4OaV6X5EtPp4gb4KJyCoCmr5a1pKY7BTFVRsUr1ynt3pT25oGpdqJboAC4vofg7yMrcjAz9/hHEIxKsPbdmp3PRqcGJQ318QPhzxfqRg09QhuO3Vkq9pzGY0oBKUjKrGKxplfbif03nLqNRXLQ1c0fhvzX+bXMz/1t2pZTF/z6wHv9Su9H9pa6Ia00pRFNPTV8OEg8nDAOGBq4/8QjX/XA4pMEQ9wvI6faxNZTJz4hDB7c8ZpDhiePGHPnPGDEZR74Tp9okaX5LQD5pfOh+lmEh4LobBgM61B7GPujxONLHDyuJtJ9g+Qah6asnvCsb/7z+sywrq//3gnX/r+LpZW0kNV6p9Kf+t4zqhv9LllnO8v8+wAm7I6x2kNQj89KjBsrnZufXZue61U4HopDFNTg0Do0D6UCiHgliOqA4b8X82Dzvtk5azc6B5EYT6q2GPrR712eN857I6L1Owj3AWjOZ/dhD0uwgS9+vBU1fvop3ZuNf5H/DzvL/XrDm/1VV98I2tuV/u2yu+v9KVeR/A7dBlv/3gNOLTyet02ZdVh7NWhG3gCjkZ7JE7okL9vFbE8xjWZmzZLBQImETH4GcdKz5eZ/bCIAwFjIIXXfCGLa83kTU/SD0YBMRBj+AOyLuLaB6UAqMOH6EzQEsVKswwNkeibFh5loeW2RbBlSeGss/Asf++d39EcxwDTp28now8f0jWPbNl0rBE41DscXVPoB8lMwo3ufhY/Pn3qfsDfN1rMX/s2btpWxsrf8qm/Vf2baNLP73AY+4vsMwbBqQdNxScr3MLwvAfL+uFBanw+rm8LLqS2ijkMeBMyabtK9UgKna5PxNnL3dkcAL2TX1QI9Y6OqihAwGIXwBzB5Q9KAGxQFYeH/vsCFXX/uxfTNYi/8nRzEvaWNb/Jdsa7P/K9tZ/7cXLLp/cewPSa/X6HQav12fN85EUWDKiazbPGufNrrN6+SEUFasVHzR67Z73YXQlqVE6oeu44t3cqIJB1Ya02niTX9LHoAG2AC+SUYvf+rPjsALk/H0aFz0nKAgr648phxx058lFC8MSGrNdWKQlScLlDFNkOCOT264GCtE4j+MAeSV7znkUbBmU1YFOSYExWu/RvwN8dqeyZAhQ4YMGTJk2A3+APiee2EAKAAA
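Before wiring a payload like this into a script, it can be sanity-checked by listing the archive straight from the encoded text. A sketch using a throwaway directory (paths are illustrative, since the blob above is long):

```shell
# Sketch: build a payload from a throwaway project directory, then list
# the archive back without extracting anything.
dir=$(mktemp -d)
mkdir -p "$dir/includes"
echo 'echo stub' > "$dir/includes/logging.sh"
payload=$(tar -C "$dir" -czf - . | base64 -w 0)
base64 -d <<<"$payload" | tar -tz   # lists the archived paths
```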

We can now create a wrapper script that creates a temporary directory, changes into it, extracts the files there, and then runs the main script.

#!/bin/bash
cd "$(mktemp -d)"
tar -xzf <(base64 -d <<<H4sIAAAAAAAAA+1Z+2/iRhDmZ0v5H+Zc98C64iePiJRc6R25IuWBOKhUJShy7AVWMba1a0LSHP97Z21eIdfSVoGoV39IRp79dmbx7IxnFk3P7RwGoloui2+zWjbWvxfImSW7YlfLlmGZOcPEayUH5d0vLZeb8NhhALkbEofBX/C2jf9HoemTmPp8p7vg7/u/VClbNvrftKtG5v99YOH/IYlbkcZHu7Cxzf8V217EfwV9j/63qlUrB8YuFrOJ/7n/v3uj39BAv3H4SJpwZ0gKKjxKgCDuKAS5zWgQcygMKOOxCq02OJ7HCOfgcB661ImJB1MajyAg8TRkt4B8wgaOS+COOjAdUXeUypAAMXMGA+pCHAKN4FBLPkA5sHCCmjRZmknSUkNdKSArGQLcoEv+FxgyEkGxHUK+8P7HukfuQL2aviu8r4OaV6X5EtPp4gb4KJyCoCmr5a1pKY7BTFVRsUr1ynt3pT25oGpdqJboAC4vofg7yMrcjAz9/hHEIxKsPbdmp3PRqcGJQ318QPhzxfqRg09QhuO3Vkq9pzGY0oBKUjKrGKxplfbif03nLqNRXLQ1c0fhvzX+bXMz/1t2pZTF/z6wHv9Su9H9pa6Ia00pRFNPTV8OEg8nDAOGBq4/8QjX/XA4pMEQ9wvI6faxNZTJz4hDB7c8ZpDhiePGHPnPGDEZR74Tp9okaX5LQD5pfOh+lmEh4LobBgM61B7GPujxONLHDyuJtJ9g+Qah6asnvCsb/7z+sywrq//3gnX/r+LpZW0kNV6p9Kf+t4zqhv9LllnO8v8+wAm7I6x2kNQj89KjBsrnZufXZue61U4HopDFNTg0Do0D6UCiHgliOqA4b8X82Dzvtk5azc6B5EYT6q2GPrR712eN857I6L1Owj3AWjOZ/dhD0uwgS9+vBU1fvop3ZuNf5H/DzvL/XrDm/1VV98I2tuV/u2yu+v9KVeR/A7dBlv/3gNOLTyet02ZdVh7NWhG3gCjkZ7JE7okL9vFbE8xjWZmzZLBQImETH4GcdKz5eZ/bCIAwFjIIXXfCGLa83kTU/SD0YBMRBj+AOyLuLaB6UAqMOH6EzQEsVKswwNkeibFh5loeW2RbBlSeGss/Asf++d39EcxwDTp28now8f0jWPbNl0rBE41DscXVPoB8lMwo3ufhY/Pn3qfsDfN1rMX/s2btpWxsrf8qm/Vf2baNLP73AY+4vsMwbBqQdNxScr3MLwvAfL+uFBanw+rm8LLqS2ijkMeBMyabtK9UgKna5PxNnL3dkcAL2TX1QI9Y6OqihAwGIXwBzB5Q9KAGxQFYeH/vsCFXX/uxfTNYi/8nRzEvaWNb/Jdsa7P/K9tZ/7cXLLp/cewPSa/X6HQav12fN85EUWDKiazbPGufNrrN6+SEUFasVHzR67Z73YXQlqVE6oeu44t3cqIJB1Ya02niTX9LHoAG2AC+SUYvf+rPjsALk/H0aFz0nKAgr648phxx058lFC8MSGrNdWKQlScLlDFNkOCOT264GCtE4j+MAeSV7znkUbBmU1YFOSYExWu/RvwN8dqeyZAhQ4YMGTJk2A3+APiee2EAKAAA)
./script-3.1.sh
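A wrapper like the one above can also be generated automatically for any project directory. A hedged sketch (the make_bundle function and its argument layout are my own illustration, not part of bash-compiler):

```shell
# Sketch: emit a self-extracting wrapper for a directory; make_bundle DIR ENTRY
# prints the wrapper on stdout. Names are illustrative.
make_bundle() {
  local dir=$1 entry=$2 payload
  payload=$(tar -C "$dir" -czf - . | base64 -w 0)
  printf '#!/bin/bash\n'
  printf 'cd "$(mktemp -d)" || exit 1\n'
  printf 'base64 -d <<<%s | tar -xz\n' "$payload"
  printf 'exec "./%s" "$@"\n' "$entry"
}
# Usage: make_bundle ./project script-3.1.sh > bundle.sh
```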

Summary

Using the techniques I have shown in this article, we achieved the goals stated at the beginning:

  1. The script consists of a single file, making it easy to distribute.
  2. The script is copy-paste-able between systems and different editors, even if connecting via multiple jumps.
  3. Files being embedded can be binary files, i.e. can contain non-printable characters.

A complex multi-file directory structure has been transformed into a copy-paste-able script a few lines long. Of course, this practice should be used only when necessary and avoided if possible: the contents, hidden behind encoding and compression, are completely obfuscated, so the user has no idea what such a script does and has to reverse engineer the process to find out. I have personally used this when working with certain client servers that were reachable via multiple jump hosts and had no internet access, where copy-pasting a single script was a big time saver over transferring the files.