
aticc's Introduction

Assessment of Trusted Internet Connection 3.0 Compliance (ATICC)

Join the chat at https://gitter.im/ATICC/community


The goal of this project is to:

Design and implement a dashboard and test suite that assesses a Software Defined Perimeter (SDP) environment's compliance with the TIC 3.0 requirements.

Contact Information

TBD

aticc's People

Contributors

dyiop, gitter-badger, imichaela, jedshakarji, jweissm, nikitawootten-nist, samuelhoward, selenaxiao


aticc's Issues

IP Denylisting InSpec Profile

User Story:

IP denylisting is one of the five basic TIC capabilities, protecting against the ingest or transit of traffic to or from a denylisted IP address.

Goals:

InSpec profiles for the following IP denylisting scenarios:

  • Transiting: Blue/green services should not be able to access any other services on the network.
  • Egress: Hosts on the inside should not be able to access denylisted external IPs.
  • Ingest: Traffic from denylisted external IP addresses should be denied.

For more information, refer to the TIC 3.0 Testing Outline document.
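The profiles themselves will be written in InSpec, but as an illustration of what the egress control needs to verify, here is a minimal Go sketch of the underlying probe (the address is a documentation placeholder, not a project value):

```go
// Egress probe sketch: from a host inside the SDP environment, attempt a
// TCP connection to a denylisted address and expect it to be blocked.
package main

import (
	"fmt"
	"net"
	"time"
)

func main() {
	// Placeholder denylisted address (TEST-NET-3 documentation range);
	// a real profile would read this from the deny list input.
	denylisted := "203.0.113.10:80"

	conn, err := net.DialTimeout("tcp", denylisted, 5*time.Second)
	if err != nil {
		fmt.Println("PASS: egress to denylisted IP was blocked:", err)
		return
	}
	conn.Close()
	fmt.Println("FAIL: egress to denylisted IP succeeded")
}
```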

Dependencies:

None.

Acceptance Criteria

  • InSpec profiles exist with clear documentation as to how each control satisfies the testing outline.

Dashboard: Ping Test False Positives

Describe the bug

The dashboard displays false positives for the ping test in certain situations, such as when the coordinator is not running.

Who is the bug affecting?

ATICC Users.

What is affected by this bug?

The ping test appears to pass when it should fail, meaning the ping test results viewable in the dashboard are not completely reliable.

When does this occur?

When the dashboard attempts to run a ping test but the coordinator is not running.

How do we replicate the issue?

  1. Ensure the coordinator is not running locally.
  2. Run the dashboard maven project.
  3. Open the dashboard at http://localhost:8080/
  4. Navigate to the testing page.
  5. Run a ping test with any parameters.
  6. See a false positive result.

Expected behavior (i.e. solution)

When the coordinator is not running, all tests should fail.

Dashboard: Edit and Delete

User Story:

As an ATICC user, I need to be able to edit and delete existing instances of tests.

Goals:

Currently we can only add tests to the dashboard, and the only way to delete them is to re-run the whole dashboard. We need the ability to both edit and delete existing tests to avoid this tedious workaround.
Editing will require some planning, because the UI will need an "editing" section. Deleting should be relatively straightforward.

Dependencies:

N/A

Acceptance Criteria

  • The UI and functionality to entirely delete existing tests.
  • The UI and functionality to edit existing tests.

Document running SDP client docker container (standalone and in test suite)

User Story:

As an ATICC user, I need clear guidance on how to use the SDP client in order to run the test suite.

Goals:

  • Documentation in the form of a README for building the SDP Client and running it, including which directories/input files to pass in.
  • Updated documentation for the InSpec profiles that includes instructions for starting the SDP client container(s).

Dependencies:

None

Acceptance Criteria

  • Updated documentation

Coordinator: Test Execution

User Story:

As an ATICC developer, I need to know how the coordinator will execute the tests.

Goals:

Along with communicating with the dashboard, the coordinator should be able to execute the tests (a sketch follows the list). This entails:

  • Creating and destroying any necessary client VMs for each test.
    • The coordinator should also be able to facilitate any necessary communication between the two clients.
    • The coordinator will create the client VMs through Docker.
  • Actual execution of tests.
    • Once each client VM is created, the coordinator will run the tests on it.
    • The coordinator will be the only place with the actual implementation of each test, although both parts (dashboard and coordinator) will need to know the tests' meta information in order to communicate correctly.
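A minimal sketch of what this flow might look like in the coordinator, assuming a hypothetical Driver abstraction over Docker (names are illustrative; issue #10 tracks the real driver):

```go
package coordinator

import (
	"context"
	"fmt"
)

// Driver abstracts the Docker operations the coordinator needs.
// This interface is hypothetical; issue #10 tracks the real implementation.
type Driver interface {
	CreateClient(ctx context.Context, image string) (id string, err error)
	Exec(ctx context.Context, id string, cmd []string) (output string, exitCode int, err error)
	DestroyClient(ctx context.Context, id string) error
}

// Result is the completion data returned to the dashboard for analysis.
type Result struct {
	Output  string
	Success bool
}

// RunTest creates a client container, runs one test command in it,
// captures the output, and destroys the container again.
func RunTest(ctx context.Context, d Driver, image string, cmd []string) (Result, error) {
	id, err := d.CreateClient(ctx, image)
	if err != nil {
		return Result{}, fmt.Errorf("creating client container: %w", err)
	}
	defer d.DestroyClient(ctx, id) // always clean up, pass or fail

	out, code, err := d.Exec(ctx, id, cmd)
	if err != nil {
		return Result{}, fmt.Errorf("executing test: %w", err)
	}
	return Result{Output: out, Success: code == 0}, nil
}
```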

Dependencies:

  • Planning:
    • Decide what kind of tests are needed. (#3)
      • Decide which configuration parameters each one needs.
      • Decide on implementation tactics for each one.
    • Decide what OS the VM Client Images will be based on
  • Development:
    • Add docker driver to GO project (#10)
    • Make Client VM Images
      1. Good Guy Client
        • Install and configure the SDP client
          • Install SDP Client
          • Get SSH access to gateway
          • Configure SDP Client
      2. Bad Guy Client
        • No further configuration necessary
    • Write the actual tests
      • For each test we will have to spawn various clients and run various commands
    • Link it to the REST API
      • Run the corresponding test (with the given configuration) when each REST request is received.

Acceptance Criteria

  • [ ]

Dashboard: Ping Tests Run Twice

Describe the bug

Any ping test issued from the dashboard is executed twice on the coordinator. This is separate from the amount parameter of the issued ping test: e.g., a ping test with amount parameter 3 will ping three times per execution, but the test executes twice, so the IP is pinged six times in total. This is unintended behavior.

If the coordinator is communicated with using something other than the dashboard, ping tests will not be duplicated. This isolates the problem to the dashboard.

Who is the bug affecting?

ATICC Users.

What is affected by this bug?

This bug is causing the coordinator to fulfill twice as many ping tests with only half of those tests contributing to results viewable via the dashboard.

When does this occur?

This bug occurs when any ping test is run from the dashboard, whether it passes or fails.

How do we replicate the issue?

  1. Run the coordinator locally.
  2. Run the dashboard locally.
  3. Ensure the coordinator URL in the dashboard is accurate.
  4. Run any ping test from the dashboard.
  5. View the log in the terminal window where the coordinator was executed.
  6. See that the test was executed twice.

Expected behavior (i.e. solution)

Each ping test should only be run once rather than twice.

Base Tests

User Story:

As an ATICC user, I need to easily create each type of base test to fulfill my overall testing outline.

Goals:

A user should be able to easily add and remove various base tests via the dashboard. A user will use this capability to implement their testing outline in the dashboard itself. The execution code for each test will be in the coordinator, and the response will be analyzed in the dashboard to determine whether the test passed or failed. The user will then be presented with an explanation of the result.

Dependencies:

  • Finalize list of base tests
  • Coordinator Things
    • Implement each base test (#6)
    • Make a new rest endpoint for each base test (#6)
  • Dashboard Things
    • Make handler for each test
      • Send corresponding REST Request
      • Analyze response to determine pass/fail and explanation

Acceptance Criteria

  • Dependencies completed
  • Usage testing done to discover new bugs and features.

Incorporate InSpec parameters into all InSpec profiles

User Story:

As an ATICC user, I want to be able to easily change values used within the InSpec profiles. This should be doable in one place. An ATICC user should not have to change the actual profiles when changing variable values.

Goals:

The parameters defined in /Profile/input_file.yml should be used in all profiles under /Profile/.

Acceptance Criteria

  • Access Control Profile incorporates relevant InSpec parameters
  • Host Containment Profile incorporates relevant InSpec parameters
  • Microsegmentation Profile incorporates relevant InSpec parameters
  • Network Segmentation Profile incorporates relevant InSpec parameters
  • IP Denylisting Profile incorporates relevant InSpec parameters

Generalizing the TIC 3.0 Testing Outline Table

User Story:

As an ATICC developer, I would like our planned tests to be applicable to any implementation satisfying TIC 3.0 guidelines. Currently, the descriptions within the TIC 3.0 Testing Outline Table include details of the ATARC implementation rather than being agnostic of implementation.

Goals:

All reliance on the ATARC implementation within the TIC 3.0 Testing Outline Table needs to be removed. Instead, the test descriptions should rely on TIC 3.0.

This involves being able to test "all" cases of a given check. For example, use a port scan to check all of the ports, as opposed to simply probing the ports we know may be open.

Dependencies:

No Dependencies

Acceptance Criteria

  • All reliance on ATARC implementation is removed from testing column (column H)
  • All tests should accurately describe the test without utilizing ATARC details

Network Segmentation InSpec Profile

User Story:

Network Segmentation is one of the five basic TIC capabilities. It ensures the network is made up of subnetworks with boundaries between one another, restricting unauthorized traffic between them.

(User Story modeled after #26)

Goals:

InSpec profiles for the following network segmentation scenarios:

  • Ingest: The gateway should reject unauthorized traffic attempting to enter the network
  • Transiting: The gateway should reject unauthorized traffic between blue/green services
  • Egress: The gateway should reject traffic within the network from accessing an external unauthorized domain

For more information, refer to the TIC 3.0 Testing Outline document.

Dependencies:

None

Acceptance Criteria

  • Network Segmentation InSpec profile exists with clear documentation as to how it satisfies the testing outline.

List of Test Types

User Story:

As an ATICC Developer, I need to define what types of tests are going to be implemented.

Goals:

Looking through the testing outline, make a concise but complete list of test types that the Dashboard can initiate and the Coordinator can execute. For each test we need the following information (a sketch of this metadata follows the list):

  • Name: A short name which describes the test
    • Example: Ping Test
  • Execution Parameters: Configurable parameters which let the test type be used for many actual tests. Each parameter should have a name and a data type
    • Example:
      1. IP: String
  • Execution Process: The actual plan on how to implement this test.
    • Example: Ping the IP address given in parameter IP
  • Completion Data: The data received at the end of the test, which will then be sent to dashboard for analysis.
    • Example:
      1. Success: Boolean
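For illustration, this metadata could live in a small structure shared conceptually by the dashboard and the coordinator; the field names below are hypothetical, not a settled schema:

```go
// Package tests holds the shared metadata describing each test type.
// (Hypothetical sketch; field names are illustrative.)
package tests

// TestType describes one type of base test. Both the dashboard and the
// coordinator need this metadata to communicate, even though only the
// coordinator implements the execution process itself.
type TestType struct {
	Name       string            // short descriptive name, e.g. "Ping Test"
	Parameters map[string]string // parameter name -> data type, e.g. {"IP": "String"}
}

// Completion holds the data received at the end of a test run, which is
// then sent to the dashboard for analysis.
type Completion struct {
	Success bool              // e.g. whether the ping succeeded
	Data    map[string]string // any additional per-test output
}

// PingTest is an example instance matching the example above.
var PingTest = TestType{
	Name:       "Ping Test",
	Parameters: map[string]string{"IP": "String"},
}
```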

Dependencies:

  • Further divide rows for tests that need to be split up (e.g. ingest, transit, egress & positive/negative results)
    • During development, keep in mind some tests will have a positive case and a negative case
  • Broaden the tests to be more thorough (e.g. tests do not depend on ATARC implementation, generalize) (#9)
  • Complete the 'SDP Config Check' column (#8)
  • Fill in how the pilot addresses each TIC requirement in the 'The Pilot Solution' column
  • Assign a base test to each row. (Ex. Packet + Request for Access Control - Ingest)

Acceptance Criteria

  • A complete list of test types, with the specified info for each one, stored somewhere in the Docs folder
  • Should be able to cover all of the actual tests listed in the testing outline.
  • Dashboard should have empty placeholders for each test type.

Dashboard: Add Coordinator Status Indicator

User Story:

As an ATICC user, I need to be able to check whether the dashboard can communicate with the coordinator using the coordinator URL.

Goals:

It should be possible to check that the dashboard can communicate with the coordinator without running an actual test. This way, an ATICC user can check whether the coordinator is running and whether their coordinator URL (in the settings tab) is accurate.

This can be done with a button on either the testing page or the settings page, as sketched below.
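On the coordinator side, the endpoint this button would poll could be as small as the following sketch (the /status path, payload, and port are assumptions, not a settled API):

```go
package main

import "net/http"

func main() {
	// Minimal health-check endpoint for the dashboard's status button to
	// poll. The path and payload are assumptions, not a settled API.
	http.HandleFunc("/status", func(w http.ResponseWriter, r *http.Request) {
		w.Header().Set("Content-Type", "application/json")
		w.Write([]byte(`{"status":"ok"}`))
	})
	// Port is a placeholder; error intentionally unhandled in this sketch.
	http.ListenAndServe(":9090", nil)
}
```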

Dependencies:

N/A

Acceptance Criteria:

  • Add a button on either the settings tab or testing tab that sends a REST request to the status endpoint of the coordinator.
  • Add a visual indicator to the same page as the button that shows whether the coordinator's status is OK.

Dashboard-driven Test Suite Architecture

User Story:

As an ATICC developer, I need to have a well-defined architecture of the Dashboard-driven testing suite.

Goals:

The Dashboard-driven test suite's architecture is finalized so that implementation can commence.
It should define all of the information necessary to complete development, including but not limited to:

  • The overall architecture (Architecture.md)
  • Test Types ()

Dependencies:

#3 List of Test Types

Acceptance Criteria

  • The architecture supports all use case scenarios that demonstrate TIC 3.0 functionality.
  • All the parts listed above are complete and reflect the current state of the architecture. Each is stored in the listed file in the Docs folder.

Dashboard: Add Layer to Handle Expected Results

User Story:

As an ATICC user, I want to be able to establish expectations for whether an individual test should pass or fail. In addition, I should be able to see how these expected results compare to actual results and see an explanation of this relationship.

Goals:

An ATICC user can attach an expected pass/fail result to each test in the dashboard. After a test runs, the dashboard compares the expected result to the actual result and presents an explanation of how they relate.

Dependencies:

#13 must be resolved first.

Acceptance Criteria

  • The testing tab within the dashboard should have a UI for adding expected results to existing tests.
  • The Action class should have a field for storing the expected result of a test.
  • After a test is run, the UI of the testing page should reflect whether the actual result matches the expected result, along with the information corresponding to the expected and actual result.
    • The comparison of the expected result and actual result as well as the handling of the UI must be done in a method of the Action class called analyze().
  • The existing explain() method should now also reflect whether the actual test result matches the expected result, in addition to the information it currently reflects (whether a test passed, or why it failed).

Dashboard: Subsequent Explanation/Analyzation

User Story:

As an ATICC developer, I need to be able to set the explanation of pass/fail directly after running the test.

Goals:

Currently (in the Action.java interface) the run() and explain() methods are separated. This means that explain() has to re-run the test just to analyze it and give an explanation for pass/fail. Either:

  1. Embed this functionality in the run() method. In this case we would remove the explain() method entirely and just store the explanation as a string in the Action.java interface.
  2. Store the response from the coordinator to analyze later. In this case we should rename explain() to analyze(), because it would serve both purposes: determining pass/fail and setting the explanation for why.

Dependencies:

N/A

Acceptance Criteria

  • Choose which way to go (1 or 2)
  • Implementation to-dos:
    • Task 1...

Dashboard: Expanding the sendRequest method

User Story:

As an ATICC developer, I want to be able to send REST requests from the dashboard that are PUT, PATCH, or DELETE requests.

Goals:

The sendRequest method in the RestService class should be able to send PUT, PATCH, and DELETE requests. Currently, the method only supports GET and POST requests. This functionality can be implemented by expanding the if statement, using a different conditional construct, or using a more general method in the RestTemplate library.

Dependencies:

N/A

Acceptance Criteria

  • Add functionality for sending PUT requests in sendRequest method of RestService class.
  • Add functionality for sending PATCH requests in sendRequest method of RestService class.
  • Add functionality for sending DELETE requests in sendRequest method of RestService class.

Coordinator Communication with Dashboard

User Story:

As an ATICC developer, I need to know how the coordinator and dashboard will communicate.

Goals:

Create and define the REST API that will be used by the dashboard in order to:

  1. Start the various tests.

  2. Get the results of the tests from the point of view of clients. The Dashboard can then consult the SDP controller to further assess the results of the tests.

Here is a PDF further describing the flow in more detail.
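As a sketch of what one such endpoint might look like in the Go REST server, here is a hypothetical ping-test handler (the path, field names, and port are assumptions pending the planning below):

```go
package main

import (
	"encoding/json"
	"net/http"
)

// PingRequest carries the configuration for one ping test.
// Field names are illustrative, not a settled contract.
type PingRequest struct {
	IP     string `json:"ip"`
	Amount int    `json:"amount"`
}

// PingResult is the completion data sent back to the dashboard.
type PingResult struct {
	Success bool   `json:"success"`
	Output  string `json:"output,omitempty"`
}

func main() {
	http.HandleFunc("/test/ping", func(w http.ResponseWriter, r *http.Request) {
		var req PingRequest
		if err := json.NewDecoder(r.Body).Decode(&req); err != nil {
			http.Error(w, "bad request", http.StatusBadRequest)
			return
		}
		// Stub: the real handler would run the ping test through the
		// coordinator's driver and fill in the actual result.
		res := PingResult{Success: false, Output: "not implemented"}
		w.Header().Set("Content-Type", "application/json")
		json.NewEncoder(w).Encode(res)
	})
	// Port is a placeholder; error intentionally unhandled in this sketch.
	http.ListenAndServe(":9090", nil)
}
```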

Dependencies:

  • Planning:
    • Decide what exact REST API endpoints we're going to need.
      • Set up which tests are going to have which configurations (#3)
  • Development:
    • Create the GO project
    • Create the GO REST Server
    • Implement REST communication in Java for sending requests to the coordinator from the dashboard
    • Set Endpoint names (one for each test?), and which configuration parameters they are going to have
      • Endpoint 1
      • Endpoint 2
      • Endpoint 3

Acceptance Criteria

  • [ ]

Testing Outline: Complete the 'SDP Config Check' column

User Story:

As an ATICC developer, I need to know which component(s) of the SDP configuration within the SDP controller are relevant for each of the planned tests.

Goals:

For each test in the TIC 3.0 Testing Outline table, find which part of the SDP configuration (visible from the SDP Controller) can be used to determine whether the test has passed or failed.

Dependencies:

No dependencies.

Acceptance Criteria

  • All rows in the TIC 3.0 Testing Outline table have the 'SDP Config Check' column completed.

Testing ssh without simply running it

  • Simply running ssh will test more than whether ssh is being served. It will also test that we are using the right/wrong keys, among other things.
    • This will become a big deal when we expect the test to fail as a BGC (bad guy client) and it fails for reasons other than the ones we expect.
  • Perhaps just check whether port 22 responds with an SSH banner? <-- figure out how to do this correctly (one approach is sketched below)
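One way to do this correctly: an SSH server sends an identification string beginning with "SSH-" (RFC 4253) as soon as the TCP connection is established, so the banner can be read without negotiating keys at all. A minimal Go sketch:

```go
package main

import (
	"bufio"
	"fmt"
	"net"
	"strings"
	"time"
)

// sshServed reports whether host answers on port 22 with an SSH
// identification string ("SSH-..."), without attempting authentication.
func sshServed(host string) (bool, error) {
	conn, err := net.DialTimeout("tcp", net.JoinHostPort(host, "22"), 5*time.Second)
	if err != nil {
		return false, err // port closed or filtered
	}
	defer conn.Close()

	conn.SetReadDeadline(time.Now().Add(5 * time.Second))
	banner, err := bufio.NewReader(conn).ReadString('\n')
	if err != nil {
		return false, err
	}
	// RFC 4253: the server's identification string starts with "SSH-".
	return strings.HasPrefix(banner, "SSH-"), nil
}

func main() {
	ok, err := sshServed("192.0.2.1") // placeholder address
	fmt.Println(ok, err)
}
```

If the banner is present, ssh is being served; whether authentication would succeed is deliberately left untested.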

Coordinator: Go Driver

User Story:

As an ATICC developer, I need to be able to easily do all the Docker things:

  • Check whether an image exists
  • Run a given image
  • Capture output
  • Check the response code
  • Kill a container gracefully

Goals:

A Go driver module in the coordinator that wraps Docker and exposes the operations listed above, so test implementations can manage client containers without dealing with Docker directly. A sketch of such a driver follows.
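A sketch against the Docker Engine SDK for Go (github.com/docker/docker/client); option-struct names vary a little between SDK versions, so treat this as an outline rather than a final implementation:

```go
// Package docker wraps the Docker Engine SDK with the operations
// the coordinator needs.
package docker

import (
	"context"
	"io"

	"github.com/docker/docker/api/types"
	"github.com/docker/docker/api/types/container"
	"github.com/docker/docker/api/types/filters"
	"github.com/docker/docker/client"
)

// Driver holds one shared Docker client.
type Driver struct {
	cli *client.Client
}

// NewDriver connects to the local Docker daemon using the usual
// environment variables (DOCKER_HOST, etc.).
func NewDriver() (*Driver, error) {
	cli, err := client.NewClientWithOpts(client.FromEnv, client.WithAPIVersionNegotiation())
	if err != nil {
		return nil, err
	}
	return &Driver{cli: cli}, nil
}

// ImageExists checks whether an image matching ref is present locally.
func (d *Driver) ImageExists(ctx context.Context, ref string) (bool, error) {
	imgs, err := d.cli.ImageList(ctx, types.ImageListOptions{
		Filters: filters.NewArgs(filters.Arg("reference", ref)),
	})
	if err != nil {
		return false, err
	}
	return len(imgs) > 0, nil
}

// Run creates and starts a container from image, waits for it to exit,
// and returns its combined output and exit (response) code.
func (d *Driver) Run(ctx context.Context, image string, cmd []string) (string, int64, error) {
	created, err := d.cli.ContainerCreate(ctx,
		&container.Config{Image: image, Cmd: cmd}, nil, nil, nil, "")
	if err != nil {
		return "", 0, err
	}
	if err := d.cli.ContainerStart(ctx, created.ID, types.ContainerStartOptions{}); err != nil {
		return "", 0, err
	}

	// Block until the container stops, then record its exit code.
	var code int64
	waitCh, errCh := d.cli.ContainerWait(ctx, created.ID, container.WaitConditionNotRunning)
	select {
	case err := <-errCh:
		return "", 0, err
	case status := <-waitCh:
		code = status.StatusCode
	}

	// Capture the container's stdout and stderr.
	logs, err := d.cli.ContainerLogs(ctx, created.ID,
		types.ContainerLogsOptions{ShowStdout: true, ShowStderr: true})
	if err != nil {
		return "", code, err
	}
	defer logs.Close()
	out, err := io.ReadAll(logs)
	return string(out), code, err
}

// Kill stops a container gracefully: SIGTERM first, then SIGKILL after
// the daemon's default timeout.
func (d *Driver) Kill(ctx context.Context, id string) error {
	return d.cli.ContainerStop(ctx, id, container.StopOptions{})
}
```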

Dependencies:

TBD

Acceptance Criteria

  • [ ]
  • [ ]


Coordinator: Semantic Error Messages

User Story:

As an ATICC developer, I need a clear way to view the errors that occur in order to easily diagnose bugs.

Goals:

As of right now, the Go code only returns error status codes, which makes it hard to debug. For cleanliness we should add a universal error logging system so we can always write errors to it, as sketched below.
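A minimal sketch using only the standard library (the file name and package layout are assumptions):

```go
// Package logging provides one shared error logger for the coordinator.
// (Sketch; file name and package layout are assumptions.)
package logging

import (
	"log"
	"os"
)

// Errors is the universal error logger. Import this package and call
// logging.Errors.Printf(...) from anywhere in the coordinator.
var Errors *log.Logger

func init() {
	// "coordinator.log" is a placeholder path.
	f, err := os.OpenFile("coordinator.log", os.O_APPEND|os.O_CREATE|os.O_WRONLY, 0o644)
	if err != nil {
		log.Fatalf("opening log file: %v", err)
	}
	// Prefix, timestamp, and source location make errors easy to trace.
	Errors = log.New(f, "ERROR ", log.LstdFlags|log.Lshortfile)
}
```

Handlers can then record the underlying error before returning a status code, e.g. logging.Errors.Printf("ping test failed: %v", err).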

Dependencies:

none

Acceptance Criteria

  • Create a file structure to house the Go code and log files
  • Implement the logging methods in a way that they can be used everywhere.

Dashboard: Refresh Test Button

Describe the bug

The dashboard refreshes and re-runs every existing test whenever a new one is added. Instead, a refresh button should trigger this.

Who is the bug affecting?

ATICC Users.

What is affected by this bug?

Adding a new test takes a long time, because every other test is run again.

How do we replicate the issue?

  1. Add test
  2. Add another test
  3. See error

Expected behavior (i.e. solution)

A refresh button should be added that provides the same refreshing functionality. The automatic re-run should no longer happen by default when adding a new test.

GitHub issue templates

User Story:

As an ATICC developer, I need tailored GitHub templates.

Goals:

  • Templates for feature requests, bug reports and questions specific to the ATICC project

Dependencies:

None

Acceptance Criteria

  • Templates exist
