
DS5111_FALL2023_SW2_Lab

  1. What did you have to do to get make to work?

The first step is to run sudo apt update, then sudo apt install make.

  2. Similarly for python3 -m venv env, what did you have to do? (How likely are you to have guessed that without their clear error message?)

To get this command to work I had to run apt install python3.10-venv. The error message made it very clear what needed to be done. Without it I would have looked up the venv requirements and likely found something after searching a while, but not as quickly or effectively as with the verbose error message.

  3. Both the pip install on the requirements.txt, and the call to run bin/clockdeco_param.py should be activating the virtual environment first. In other words, there are two bash commands separated by a ;, the first of which activates. Why can't we just do that on a separate line? In other words, why do we have to do that in one line and separate the commands with a ;?

In a Makefile, each line of a recipe runs in its own shell, so state such as an activated virtual environment does not carry over from one line to the next. The semicolon (;) separates multiple commands that run within a single shell invocation, so we leverage it to activate the environment and run the Python script in the same shell. Otherwise we would activate the environment in one shell, and then a fresh shell would run the script with the system interpreter (/usr/bin/python3).
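A minimal sketch of how such recipes might look in the Makefile (target layout assumed from the lab; only the file names mentioned above are taken from the repo):

```make
env: requirements.txt
	python3 -m venv env; . env/bin/activate; pip install -r requirements.txt

run:
	. env/bin/activate; python bin/clockdeco_param.py
```

Because each recipe line gets its own shell, the activation and the command that depends on it must share a line.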

  4. As it is, both the env and tests jobs run differently in that only one runs if the directory exists. This is as intended and all is well. What do you think about the job run? What would happen if you accidentally had a file called run in your directory? What can we do to fix this?

If we had a file called "run", running make run would report that 'run' is up to date, because make sees an existing file named run with no newer prerequisites and decides there is nothing to do. We can fix this by declaring run as a phony target (.PHONY: run), which tells make that run names a command to execute rather than a file to build. The following shows the scenario:

ubuntu@ip-172-31-45-242:~/DS5111_FALL2023_SW2_Lab$ touch run
ubuntu@ip-172-31-45-242:~/DS5111_FALL2023_SW2_Lab$ make run
make: 'run' is up to date.
ubuntu@ip-172-31-45-242:~/DS5111_FALL2023_SW2_Lab$ rm -rf run
ubuntu@ip-172-31-45-242:~/DS5111_FALL2023_SW2_Lab$ make run
[0.12322140s] snooze(0.123) -> None
[0.12322378s] snooze(0.123) -> None
[0.12318110s] snooze(0.123) -> None
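A sketch of the phony declaration (recipe body assumed from the lab):

```make
.PHONY: run
run:
	. env/bin/activate; python bin/clockdeco_param.py
```

With .PHONY, make runs the recipe even when a file named run exists in the directory.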

  5. The code provided to you for the test file starts with two lines, seemingly to append something to sys.path. What is the purpose of these lines?

sys.path is a list, built into the sys module, of the directories the Python interpreter searches when importing modules. Using the .append() method lets us add a directory for the interpreter to search. The sys.path.append(".") statement in our case adds the current working directory, i.e. the directory the script is run from. With that path appended we can import a module located in the same directory, or in our case from the bin folder, which contains our main script. I've also seen/used the approach of adding an empty __init__.py file to the various folders in the project so Python knows which directories are packages you can import from.
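A minimal sketch of the idea (the "./bin" path mirrors the lab layout; the exact paths in the provided test file may differ):

```python
import sys

# Append extra search paths so the interpreter can find modules that live
# outside the default locations, e.g. the repo root and its bin/ folder.
sys.path.append(".")
sys.path.append("./bin")

# After this, an import statement will also search "." and "./bin"
# in addition to the standard locations.
print(sys.path[-2:])  # the two paths we just appended
```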

Extra Credit

  1. Execute sudo apt install tree, and use that application to print out the file and directory structure, just as it is shown in this document at the top. You will have to look up in the reading, or google it in stackoverflow, what flag you need to exclude the 'env' directory. No need to cut and paste the structure, just include the full line you used to get it working.

Using the -I flag we can add patterns we want to exclude. In our case it's the env directory, so the command tree -I 'env' achieves the intended result.

  2. Your .gitignore has 'env/', and also a callout to ignore the compiled python files, the ones in __pycache__ folders. What is the meaning of the **/*?

A leading ** followed by a slash means match in all directories. An asterisk * matches anything except a slash. Therefore, the **/* pattern wildcard-matches all files and directories recursively, meaning we can ignore the compiled Python files in any __pycache__ folder wherever it is found in the repo, rather than specifying a path to each individual __pycache__ folder that was generated.
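A sketch of the .gitignore entries described above:

```
env/
**/__pycache__/*
```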

  3. Do a pip list or pip freeze and call out versions of the pytest and pylint packages in your requirements.txt. Include them in your requirements.txt, and for the extra credit, just add a note reminding me you included them.

The command pip freeze > requirements.txt was used to update the file with the specific package versions used in the env venv, pytest and pylint included.

  4. In the sample code from the book, why does the line if __name__=="__main__": allow the script to run if called directly, but not otherwise? What's going on there?

That line lets the program determine whether the script is being run directly as the main program or imported as a module into another script. __main__ is the name of the environment where top-level code runs: when a script is executed directly, Python sets its __name__ to "__main__", whereas when a module or package is imported, __name__ is set to the module's name. Therefore, comparing __name__ against "__main__" tells us whether the script is being imported as a module or is in fact "main", a.k.a. the driver script.
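A minimal sketch of the guard (file and function names are illustrative, not from the book's sample):

```python
# greet.py — hypothetical module demonstrating the __name__ guard.
def greet():
    return "hello from greet"

if __name__ == "__main__":
    # Reached only when the file is executed directly (python greet.py),
    # because Python names the entry script "__main__".
    print(greet())
# When another script does `import greet`, __name__ is "greet" instead,
# so the print above is skipped and only the definitions are loaded.
```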

  5. If you add two print statements, (or any statements for that matter), one above and one below the if __name__... line, what would happen when I do an import of the file? What happens when I call the file directly with python <filename>? Most importantly, why?

If you add any print statements above or below, but not inside, the if __name__ ... block, those will print to the console when you import the file, because a module's top-level code is executed sequentially when it is imported. Any print statements within the if __name__ ... block will only print to the console if the file is run directly, because that block's condition is only true when the file is the entry script (see the response to question 4).
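A runnable sketch of that behavior, using a throwaway module written to a temp directory (the module name demo and its contents are hypothetical):

```python
import contextlib
import io
import os
import runpy
import sys
import tempfile

# Hypothetical module with prints above and below the guard.
SRC = (
    'print("above")\n'
    'if __name__ == "__main__":\n'
    '    print("inside")\n'
    'print("below")\n'
)

with tempfile.TemporaryDirectory() as tmp:
    path = os.path.join(tmp, "demo.py")
    with open(path, "w") as fh:
        fh.write(SRC)
    sys.path.insert(0, tmp)

    # Direct execution: the module runs with __name__ set to "__main__",
    # so all three prints fire.
    direct = io.StringIO()
    with contextlib.redirect_stdout(direct):
        runpy.run_path(path, run_name="__main__")

    # Import: __name__ is "demo", so the guarded print is skipped.
    imported = io.StringIO()
    with contextlib.redirect_stdout(imported):
        import demo  # noqa: F401

direct_lines = direct.getvalue().split()
imported_lines = imported.getvalue().split()
print(direct_lines)    # ['above', 'inside', 'below']
print(imported_lines)  # ['above', 'below']
```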

Contributors: efrainolivaresuva, dscer
Watchers: Omkar Bhat

Journal Entry 2

What were some things you learned in the module? The pylint package, in addition to the linters in the preferred IDE, is going to be really helpful. I didn't know you could pull the file down, modify it for the linting task, and run the process with the modified file. The ability to suppress lines from the code analysis is good to know as well. I also learned about the GitHub Action to automatically run various components of the makefile. Looking forward to more CI/CD work!

What do you think were the most important concepts? In this case the most important concept was automating the execution of pytest and pylint via GitHub actions. This will ensure that all code commits are clean and working properly. With the GitHub actions and these automated tasks, it will accelerate development and create more situational awareness of the state of the repository.

What was challenging for you? How can you learn it better? I thought everything for this lab was straightforward. The reading and directions made it clear what had to be done. I did feel that the pylint output was really verbose and confusing at first compared to the normal stack trace output, but I'm adjusting. I see you can use the -q flag to make the output more concise.

Which parts did you enjoy? I really enjoyed the pylint automation. I integrated the same functionality into a repo I'm currently working on at work. I'm trying to help adopt more coding standards in my division at work and this is something I will leverage in that effort. I enjoyed learning about GitHub actions, however, my organization uses GitLab so I will have to translate what I learn here to my gitlab-ci.yml.

Journal Entry 1

What were some things you learned in the module? I've used Python virtual environments before but never generated one in a makefile. This will be handy in getting others who use my repos up and running a lot faster than writing build documentation. I see why you have to activate the environment and pip install the requirements on the same line; this is something that would not have been obvious to me at first. Also, using the pytest module and executing it in the makefile is handy as well. In the past, I have written a Python script to traverse my tests directory and run all test files, and that script was executed manually. The method shown in this lab is more concise and streamlined. I will be using what I learned from this lab a lot in the future!

What do you think were the most important concepts? Using automation to create consistent build environments for a repo is one of the main important concepts. The others are how to effectively run tests in a repo, how to use decorators appropriately when importing the functions from other scripts, and understanding the nuances of if __name__....

What was challenging for you? How can you learn it better? I've used decorators for creating Flask/FastAPIs in the past where you need to include decorators such as @app.route('/', methods=['GET', 'POST']) above the endpoint function. However, I have never written my own before so learning where they should be declared was new to me (specifically seeing the 3 nested functions) and something I had to read over in the book a couple of times. After applying this in the lab it helped to solidify my understanding. To learn this better I think I'll need to continue to write some of my own decorators to make sure I can apply this in more complex programs and make sure they are callable.

Which parts did you enjoy? I enjoyed applying the venv creation and pytest automation. I've built my own solutions for this in the past but not as streamlined as what I learned here using the makefile. This will be part of my workflow from now on.

Journal Entry 3

What were some things you learned in the module? I learned that you can use marks not only to skip non-production tests but also to evaluate conditional statements. This will be useful for running tests at different stages of deployment without rewriting a bunch of tests. The fixture feature also helped reduce the amount of duplicative code.

What do you think were the most important concepts? Understanding the logical structure of Given/When/Then for the tests as well as using the various decorators to make tests more robust and comprehensive.

What was challenging for you? The fixture part was a bit challenging for me, but I find it to be really useful for removing duplicative code.

Which parts did you enjoy? I enjoyed the entire exercise, especially using the parametrize marker to run a series of tests.
