
Embedding Data Files in KNIME Workflows

Paul Wisneskey
May 13, 2020

When developing stand-alone workflows for my past KNIME-related blog entries, I’ve often distributed their associated source data files by embedding them right in the workflows themselves. I use a little trick I picked up from an old KNIME forum post describing the technique. While I’ve seen other workflows use it, I have not seen a step-by-step guide on how to do it, so for this week’s blog post I thought I’d show you how.

While this technique works for any of the file-reading nodes, I am going to assume I have an existing workflow that reads a sample CSV file located on my desktop. To embed that file, I need to copy it into a special folder inside the node’s configuration directory in my workflow’s save location. The easiest way to find the workflow’s save location is to right-click the workflow in the KNIME workbench’s explorer, select the “Copy Location” menu item, and then choose “Local path” from the sub-menu.


This copies the workflow’s save directory location to the clipboard so that it can be pasted into a change-directory (cd) command in your shell.

Now that we are in the workflow’s save directory, we need to find the directory holding the CSV Reader node’s configuration files. KNIME names these directories after the node type plus a unique node ID number, so if you have multiple CSV Reader nodes it can be a challenge to pick out the right one. To make sure I have the correct node, I enable the workbench option (located in the toolbar) that shows node identifiers in the node titles.

The node’s title now matches the name of its configuration directory.
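
If you would rather locate that directory programmatically than hunt through a file browser, a short script can list the candidates. This is just a sketch; the workspace path and node name below are placeholders, not the actual values from this example:

```python
from pathlib import Path

# Placeholder path: paste in the "Local path" copied from KNIME's
# "Copy Location" menu item for your own workflow.
workflow_dir = Path("/Users/me/knime-workspace/EmbeddedDataExample")

# Node configuration directories are named after the node type plus its
# unique node ID (e.g. "CSV Reader (#1)"); list the ones for our node type.
for node_dir in sorted(workflow_dir.iterdir()):
    if node_dir.is_dir() and node_dir.name.startswith("CSV Reader"):
        print(node_dir.name)
```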

To include a data file with the node’s configuration information, the data file must be copied into a subdirectory named “drop” inside the node’s configuration directory.
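
The copy itself is a single command in your shell; as a minimal sketch under the same assumptions (the node directory and desktop paths are illustrative, not the ones from this post), it looks like this:

```python
import shutil
from pathlib import Path

# Placeholder paths: the node configuration directory found above and the
# sample CSV file sitting on the desktop.
node_dir = Path("/Users/me/knime-workspace/EmbeddedDataExample/CSV Reader (#1)")
source_csv = Path("/Users/me/Desktop/sample.csv")

# The embedded data file must live in a subdirectory named "drop" inside
# the node's configuration directory.
drop_dir = node_dir / "drop"
drop_dir.mkdir(exist_ok=True)
shutil.copy(source_csv, drop_dir / source_csv.name)
```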

Now that the data file is copied into the workflow, we need to edit the node’s configuration twice to access it. I am not exactly sure why two edits are necessary, but my guess is that the node’s configuration directory is not rescanned until the first configuration edit is saved. For the first edit, I simply open the node’s configuration dialog and change the source file so that the dialog has to save its configuration when I close it.

After that configuration is saved, I immediately reopen the configuration dialog, and this time a special flow variable is available that has been automatically set to the name of the embedded data file.

It is then just a matter of binding this flow variable to the “Input Location” setting in the dialog (which is called the “url” parameter internally).

Now, when the node is executed, it will use the embedded copy of the data file. Furthermore, the data file will stay with the workflow when it is exported or shared to the KNIME Hub. Embedding data files like this makes it easy to share self-contained workflows with no external dependencies to resolve before they can be executed.
