AIDS Walk Team Fundraising Scraper

A toy project that periodically scrapes AIDS Walk Wisconsin fundraising data for a team and notifies you of your standing within it.

This code does what it needs to do. That doesn't include being taken very seriously.

What Is It?

Broadly speaking, there are three things that the code in this repo does:

  1. Scrape a team's fundraising data, returning a list of team members and the amounts they've raised, sorted in descending order by amount.
  2. Store data from each scrape in a database, so you can Have Fun With It later.
  3. Send an email telling you how much you've raised and where you are in the standings.

Each of these features is implemented as its own AWS Lambda function.
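
To make the shape of the scraping handler concrete, here is a minimal, hypothetical sketch. The URL, CSS selector, and field names are illustrative placeholders, not taken from this repository, and the real handler may be structured quite differently.

from typing import Dict, List

import requests
from bs4 import BeautifulSoup

# Placeholder URL -- the real team page address would be configured elsewhere.
TEAM_PAGE_URL = "https://example.org/my-team"


def scrape_team_handler(event, context) -> List[Dict]:
    """Return team members and their totals, sorted descending by amount raised."""
    response = requests.get(TEAM_PAGE_URL, timeout=10)
    response.raise_for_status()
    soup = BeautifulSoup(response.text, "html.parser")

    members = []
    for row in soup.select("table.team-roster tr"):  # selector is illustrative
        cells = row.find_all("td")
        if len(cells) < 2:
            continue
        name = cells[0].get_text(strip=True)
        amount = float(cells[1].get_text(strip=True).lstrip("$").replace(",", ""))
        members.append({"name": name, "amount": amount})

    return sorted(members, key=lambda member: member["amount"], reverse=True)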

This project also defines the infrastructure for deploying these functions to AWS. Terraform is our infrastructure provisioning tool of choice.

Beyond Lambda, this project depends almost entirely on AWS services. It uses DynamoDB as its database (not that this project's use cases really exercise it) and Simple Email Service to send notifications. You will need to verify any email addresses you use with SES.
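
As a rough sketch of how the storage and notification steps could use those services with boto3 (the table name, email addresses, and attribute names below are placeholders; the project's actual schema and identities will differ):

from datetime import datetime, timezone
from decimal import Decimal

import boto3


def store_scrape(members):
    """Persist one scrape's results in DynamoDB (numeric values must be Decimal)."""
    table = boto3.resource("dynamodb").Table("fundraising-scrapes")  # placeholder table name
    table.put_item(
        Item={
            "scraped_at": datetime.now(timezone.utc).isoformat(),
            "members": [
                {"name": m["name"], "amount": Decimal(str(m["amount"]))}
                for m in members
            ],
        }
    )


def send_standing_email(standing, total):
    """Email your current standing via SES; both addresses must be verified."""
    boto3.client("ses").send_email(
        Source="sender@example.com",  # placeholder, must be verified in SES
        Destination={"ToAddresses": ["you@example.com"]},  # placeholder
        Message={
            "Subject": {"Data": "Your fundraising standing"},
            "Body": {"Text": {"Data": f"You are #{standing} with ${total:.2f} raised."}},
        },
    )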

Project Structure

This project is largely divided into four top-level directories:

  • script/ includes the initialization and deployment scripts you'll need to work with the project.
    • script/init.sh is the first thing you'll want to run to set up a development environment. It will install the development dependencies and seed a .tfvars file that you can use for testing.
    • script/package.sh will create a .zip artifact for the Lambda functions. You can execute this yourself, or you can call it indirectly via script/build-and-deploy.sh.
    • script/build-and-deploy.sh will, when given the name of an infrastructure deployment (test or prod), refresh the Lambda function artifact and redeploy the infrastructure.
  • src/ includes the code for the Lambda function handlers.
  • tests/ includes the (basic!) unit tests for the handlers. These are not included in the build artifact.
  • infra/ includes the Terraform infrastructure definitions for "test" and "prod" stacks. These both rely on shared infrastructure modules, and are separated from each other largely to make it easy to maintain state separately.

Getting Started

Requirements

Command Overview

To set the project up for local development:

$ script/init.sh

To run the tests:

$ pytest

To update the test snapshots:

$ pytest --snapshot-update
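
The --snapshot-update flag comes from a pytest snapshot plugin. As a minimal sketch of what a snapshot test in tests/ might look like, assuming the syrupy plugin (one common provider of the snapshot fixture and this flag; the project may use a different plugin with a different API):

# tests/test_scraper_snapshot.py -- hypothetical example, assuming syrupy
def test_scrape_results_snapshot(snapshot):
    # In a real test, this would come from the scraping code run against a canned page.
    results = [
        {"name": "Alice", "amount": 150.0},
        {"name": "Bob", "amount": 75.0},
    ]
    # A run with --snapshot-update records the value; later runs compare against it.
    assert results == snapshot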

To deploy test infrastructure:

$ script/build-and-deploy.sh test

To deploy production infrastructure:

$ script/build-and-deploy.sh prod

Customizing The Infrastructure

The infrastructure deployments are parameterized by a few settings that most deployments would want to change. These are given in the production and test infrastructure definitions.

To get you started, the test infrastructure has an example terraform.tfvars file. script/init.sh will copy this to infra/test/terraform.tfvars for you to edit.

Development And Production Dependencies

This project contains two Pip requirements files:

  • requirements.txt defines the dependencies that are needed to run the application code on AWS Lambda infrastructure
  • requirements-dev.txt defines the dependencies that are needed to develop and test the application locally

If you use an IDE, you will likely need to configure it to use requirements-dev.txt as your requirements file to have proper navigation and autocompletion. For example, you can find instructions for how to do this in PyCharm here.
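
If you manage the environment yourself rather than (or in addition to) running script/init.sh, the development dependencies can be installed directly with pip, for example inside a virtual environment:

$ python -m venv .venv
$ source .venv/bin/activate
$ pip install -r requirements-dev.txt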

A Brief Word About Tradeoffs

This project is not designed or expected to last very long. This impacts most of the engineering decisions:

  • AWS Terraform providers are used directly with little to no abstraction.
  • The unit tests are relatively sparse, mostly serving to flag accidental regressions during local refactoring. The project was not created with test-driven development.
  • The Python source code is not modularized or packaged into separate artifacts for each Lambda function.
  • There is no infrastructure definition for a deployment pipeline.

License

3-Clause BSD License
