
pillar.gg's Introduction

Pillar

This project is initialized with Ant Design Pro and Vercel.

Environment Setup

You should have Node 16 installed (brew install node@16, or use a Node version manager).

  1. Install yarn and vercel

    npm i -g yarn vercel
  2. Install dependencies

    yarn --immutable
  3. Link vercel project (must have vercel project connected to repo)

    vercel link
  4. Download the environment variables

    vercel env pull

Start project

vercel dev -l 8000

Troubleshooting

  • Clean yarn local or global cache

    yarn cache clean
    yarn cache clean --mirror
  • Clean umi build artifacts

    rm -r src/.umi*
  • Rebuild dependencies

    yarn rebuild

pillar.gg's People

Contributors

russeii, rodneymorgan97, swindesr, chand1012, 0xju1ie, tedbyron, geczy, mattmorris92, jw209, yuki-yama


pillar.gg's Issues

Automate DevOps workflows for webhook development + handshakes: local dev vs dev.pillar.gg vs production

Description:
In order to subscribe to a user's stream events through Twitch (e.g. the stream.offline event, https://dev.twitch.tv/docs/eventsub), we need to do a handshake with Twitch verifying that we own the callback URL. Once the handshake is completed, Twitch subscribes that callback URL and sends events to it.

That handshake needs to happen every time a new user registers.

  • callback URL should be set as an ENV variable during build time.
  • When developing locally, an ngrok URL should be spun up + set as our ENV callback url automatically.
  • When pushing to develop, we want to have that callback URL be set to dev.pillar.gg/
  • When pushing to master, that callback URL should be set to pillar.gg/

Context:
We might be able to do a more lightweight solution when developing locally (e.g. do we really need to create an ngrok callback URL every time we develop locally? Testing the webhooks should be less than 2% of our total dev effort, so we may not need to build a DevOps process around making the callback URLs permanent).
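
A minimal sketch of the handshake itself, assuming a Vercel serverless function at a hypothetical api/twitch-webhook.ts path: Twitch POSTs a webhook_callback_verification message when the subscription is created, and we prove ownership of the callback URL by echoing the challenge back.

    // api/twitch-webhook.ts (hypothetical path)
    import type { VercelRequest, VercelResponse } from '@vercel/node';

    export default function handler(req: VercelRequest, res: VercelResponse) {
      const messageType = req.headers['twitch-eventsub-message-type'];

      // Twitch sends a one-time challenge when the subscription is created;
      // echoing it back verifies that we own this callback URL.
      if (messageType === 'webhook_callback_verification') {
        res.status(200).send(req.body.challenge);
        return;
      }

      if (messageType === 'notification') {
        // e.g. a stream.offline event for a subscribed broadcaster
        console.log(req.body.subscription.type, req.body.event);
      }

      res.status(200).send('ok');
    }

A production handler would also verify the Twitch-Eventsub-Message-Signature HMAC before trusting the payload.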

Add user-specific language to the video export user flow

When a user clicks 'combine clips', our backend merges the clips + produces a single video for them, and emails a link to the video.

The video creation process takes 3-15 minutes to process, then they receive an email with a link to the compilation.

We need to make this flow clear to the user by adding a message or email receipt telling the user what the process will be + when they can expect the video

  • Add a message for the user post clip-creation detailing what will happen next (rough sketch below)
  • Possibly send an email receipt to the user, thanking them for creating a video and letting them know they will receive the link in 3-15 minutes
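
A rough sketch of the post clip-creation message, assuming antd's message API (which the Ant Design Pro stack already ships); the handler name is hypothetical:

    import { message } from 'antd';

    // Hypothetical callback invoked after the 'combine clips' request succeeds.
    const onClipsSubmitted = (email: string) => {
      message.success(
        `We're stitching your clips together now. A download link will be emailed to ${email} in roughly 3-15 minutes.`,
        10, // keep the toast visible for 10 seconds
      );
    };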

Fix playback bar for clips

Improve the playback experience for clips. Currently you can't scrub through the clip. The slider jumps around a lot.

  • implement onSeek functionality (rough sketch below)
  • fix slider jumping around when changing clips (or seeking)
  • Move clip view to the side (see below)
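
A rough onSeek sketch, assuming the clip plays in a plain HTML5 video element; the actual player component may differ:

    import { useRef } from 'react';

    // Hook sketch: exposes a ref for the <video> element and a seek handler
    // for the playback slider (value in seconds).
    export const useClipSeek = () => {
      const videoRef = useRef<HTMLVideoElement>(null);

      const onSeek = (seconds: number) => {
        if (videoRef.current) {
          videoRef.current.currentTime = seconds;
        }
      };

      return { videoRef, onSeek };
    };

To stop the slider jumping while the user drags, only sync the slider to the video's timeupdate events when a drag is not in progress.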

Switch API token to the Authorization header

Description:
Currently all of our API calls are being done via POST requests and the API key is being passed in via the body of the request. This is bad practice.


How to get started:
The authorization system should be modeled after the REST standard, and it is explained better here. Convert the API endpoints so the API key is passed in the header as a base64 string (see the sketch at the end of this issue).

Acceptance Criteria:
The API key is now passed to the request via the header, not the body.
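
A minimal client-side sketch of what the converted call could look like; the endpoint name and key source are placeholders, not the actual Pillar API:

    // Hypothetical endpoint; the key now travels in the header, not the body.
    const fetchClips = async (apiKey: string) => {
      const res = await fetch('/api/getClips', {
        method: 'GET',
        headers: {
          // API key passed as a base64 string in the Authorization header
          Authorization: `Basic ${btoa(apiKey)}`,
        },
      });
      return res.json();
    };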

Make Twitch Monitoring/YT Uploading Feature Dynamic

Description: allow the number-crunching server to receive all streamers we're tracking, even after new ones are added

How to get started:

  • modify existing tracked users (rocketleague, ludwig, sykkuno, shroud, tommyinnit) in "users" table to contain monitoring:true
  • modify /api/streamersToPoll (in the remyx project) to return the streamers we are monitoring (search for users with monitoring:true; see the query sketch at the end of this issue)
  • create a page (admin access only) on remyx that lets us specify a Twitch streamer and the YouTube account to link it with, and adds it to the users table (a new API route probably needs to be created)
  • documents must be in the following format
    {
      "_id": {
        "$oid": "5fb806c8dd4cafff3970693e"
      },
      "twitch_username": "rocketleague",
      "twitch_id": "57781936",
      "youtube_credentials": {
        "access_token": "xxxxxxxxxxxxxxxxxx",
        "refresh_token": "xxxxxxxxxxxxxxxxxx",
        "scope": "https://www.googleapis.com/auth/youtube.upload",
        "token_type": "Bearer",
        "expires_in": "3599"
      },
      "monitoring": "true"
    }

Acceptance:

  • Users that are added from new page are appearing in mongodb
  • test with postman and TwitchHighlights to verify all monitored streamers are being added as expected
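
A sketch of the /api/streamersToPoll query side, assuming the official mongodb driver and a MONGO_URI env var; the database name is a placeholder:

    import { MongoClient } from 'mongodb';

    export const getMonitoredStreamers = async () => {
      const client = await MongoClient.connect(process.env.MONGO_URI as string);
      try {
        // The sample document stores "monitoring" as the string "true", so
        // match both the string and a real boolean to be safe.
        return await client
          .db('pillar') // hypothetical database name
          .collection('users')
          .find({ monitoring: { $in: [true, 'true'] } })
          .project({ twitch_username: 1, twitch_id: 1 })
          .toArray();
      } finally {
        await client.close();
      }
    };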

Add user relation to the "timestamps" collection

Currently the timestamps collection contains all of the timestamps and the video's ID, but I think we should put some information in there that lets us relate the video to a user without having to hit the Twitch API again, like putting the user's Twitch ID or their MongoDB ID in the document. Currently the email service has to hit the Twitch API to get the user with the specified video ID, which counts against our Twitch API rate limit.
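
For illustration, the document could gain a single field; everything here besides the existing video ID and timestamps is an assumption about naming:

    // Hypothetical shape of a timestamps document after adding the relation.
    interface TimestampDoc {
      videoId: string;
      timestamps: { startTime: number; endTime: number }[];
      // New: lets the email service look the user up without calling Twitch.
      twitchId: string; // or the user's MongoDB _id
    }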

Switch API Endpoints to fit REST Standard (sort of)

Description:
The REST standard states that each request type should serve a different function. A GET request retrieves data, a POST request creates new data, a PUT request replaces or updates existing data, and a DELETE request removes data. That being said, most APIs really only use GET and POST, and when they do have all four, GET and POST are by far the most used. I think that we should change certain endpoints over to be GET requests, but I don't see a reason to start using all four request types (yet).

Additional context
As stated in #17, it is bad practice to have your authorization token in the body of your request. Once that has been addressed, this issue can be addressed as well. Here are the endpoints I think should be changed over:

Basically, if the word "get" is in the endpoint, and the only data being sent to the endpoint is the API key, I think it should be changed to a GET request.

How to get started:
Convert the endpoints above to GET requests, unless you have an argument for why a POST request is better (see the sketch at the end of this issue).

Acceptance Criteria:
All of the conversions are made.
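
A sketch of what a converted "get"-style endpoint could look like as a Vercel serverless function; the handler body is a placeholder:

    import type { VercelRequest, VercelResponse } from '@vercel/node';

    export default async function handler(req: VercelRequest, res: VercelResponse) {
      if (req.method !== 'GET') {
        res.status(405).json({ error: 'Method not allowed' });
        return;
      }
      // API key now comes from the Authorization header (see #17), not the body.
      const auth = req.headers.authorization ?? '';
      // ...validate auth, then fetch and return the requested data
      res.status(200).json({ ok: true });
    }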

Integration of community created clips

https://dev.twitch.tv/docs/api/reference#get-videos

Assumptions

  1. ccc are "good" clips
  2. Most viewed ccc are better
  3. ccc are not spam

Potentially useful metrics to filter ccc by (a rough filter sketch follows this list):

  • view_count (some kind of ratio between ccc and VOD viewcount or mean VOD viewcount for that specific channel)
  • viewable (maybe private ones are higher quality? if we can figure out the ccc itself)
  • type (I'm assuming "highlight" is what ccc is?)
  • created_at (maybe further from VOD create time is better, because those people might put more thought into clicking the "make ccc" button?)
  • published_at (not sure if this would be a good metric)
  • duration (greater than like 15 seconds? Whatever limit is best for our customers)
  • description (if ccc have description, maybe more likelihood that they are not spam?)
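
A rough filter over those metrics; the type and thresholds below are assumptions, not Twitch's actual response schema:

    // Candidate ccc with only the fields we filter on.
    interface CandidateClip {
      view_count: number;
      type: string; // e.g. 'highlight'
      duration: number; // seconds
      description: string;
    }

    const filterCCCs = (clips: CandidateClip[], meanVodViews: number) =>
      clips.filter(
        (c) =>
          c.type === 'highlight' &&
          c.duration >= 15 &&
          // crude quality heuristic: views relative to the channel's VODs
          c.view_count / Math.max(meanVodViews, 1) > 0.01,
      );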

Sync user usage to HubSpot

It would be great to sync a user's usage into HubSpot Reporting so we can get data about our core feedback loop.

  • (optional) Add an Event for streamer.online and streamer.offline
  • Add an Event to measure top of funnel of feedback loop (app logins?)
    • Decide on what single metric to track user coming into Pillar (top of funnel of feedback loop)
  • Add an Event for streamer selecting a VOD to make clips from (to measure Review Clips)
  • Add an Event for streamer exporting a video
  • Add an Event for Asking for Feedback
  • Create a report in Hubspot that shows the funnel (for Customer NPS score)

Signing out and signing back in requires two login attempts

To reproduce:

  • Sign out of Pillar (or clear Local Storage via Chrome Developer Tools)
  • Sign in and authenticate with twitch.

You are redirected back to the root of the site (localhost:8000/) instead of localhost:8000/home.

  • Attempt login again
  • Get redirected to localhost:8000/home.


I believe this has something to do with the timing between:

  • setting the access token to local storage
    https://github.com/pillargg/pillar.gg/blob/1dfb5ed405d8833794ac8c6e32cbc8fb5907dd68/src/pages/TwitchAuth.tsx#L12
  • calling the function that uses the access token to get twitch data and set it to initialState.currentUser
    https://github.com/pillargg/pillar.gg/blob/1dfb5ed405d8833794ac8c6e32cbc8fb5907dd68/src/app.tsx#L16-L32
  • the check that looks for a currentUser before allowing the user to redirect to /home
    https://github.com/pillargg/pillar.gg/blob/1dfb5ed405d8833794ac8c6e32cbc8fb5907dd68/src/app.tsx#L44-L48
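
One possible fix, sketched under the assumption that the @@initialState model exposes a refresh method (as Ant Design Pro templates do): store the token, then explicitly refresh initialState so currentUser is populated before the /home redirect check runs.

    import { useModel } from 'umi';

    // Sketch only; the exact wiring in TwitchAuth.tsx may differ.
    const useTwitchLogin = () => {
      const { refresh } = useModel('@@initialState');

      return async (accessToken: string) => {
        localStorage.setItem('accessToken', accessToken);
        await refresh(); // re-runs getInitialState, which reads the token
      };
    };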

Dynamically delete chat messages after uploading to YT

Messages should be deleted from the MongoDB database once the video has been uploaded.

AC: make a cron job that runs daily and deletes all messages that are 1+ week old. Make a /cleanup API endpoint that does the deleting, and have the cron job hit that endpoint once a day.
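
A sketch of that /cleanup endpoint as a Vercel serverless function; the collection and field names are assumptions:

    import type { VercelRequest, VercelResponse } from '@vercel/node';
    import { MongoClient } from 'mongodb';

    export default async function handler(req: VercelRequest, res: VercelResponse) {
      const oneWeekAgo = new Date(Date.now() - 7 * 24 * 60 * 60 * 1000);
      const client = await MongoClient.connect(process.env.MONGO_URI as string);
      try {
        const result = await client
          .db('pillar') // hypothetical database name
          .collection('messages')
          .deleteMany({ createdAt: { $lt: oneWeekAgo } });
        res.status(200).json({ deleted: result.deletedCount });
      } finally {
        await client.close();
      }
    }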

Setup DataDog for CDK

Right now, it's kind of a bad dev experience to debug a failed pillar request:

  1. Invoke a lambda to test your desired change
  2. Wait to see if it ran successfully
  3. If it didn't, check logs for all the lambdas in the flow to find out where in the pipeline the request failed
  4. Repeat

Step 3 is the big friction point.

Russell mentioned that CDK may have something built into it that would allow us to expose the failure point in a pipeline instance and show logs for it easily.

AC:

  • Do research into CDK to see if this is possible
  • (optional) update #backend-team channel with research findings
  • Implement + test

Clips need a rating system

Data team needs feedback on clips. Maybe a separate "testing" page vs a "presentation" page? For testers vs users.

Ideas:

  • Each clip rated as "smiley" :), "frowny" :(, or "spam".
  • To evaluate spam, chat next to vid might be useful (#42) (i.e. chat was spamming and that was caught by the algo, but the clip was not actually good)

To avoid bias in rating:

  • Clips need to be ordered randomly (not by top 5)
  • Algos need to be randomly presented (not one row per algo, but mix clips from diff algos in the same row)
  • Timestamps not shown

Tinderify the rating?

  • Show 4 clips at a time total on the page
  • Each clip is from a different algo
  • After one clip is rated, that clip's card disappears (dopamine checkmark?)
  • After the row is cleared, a brand new set of 4 clips, one from each algo, is shown
  • Repeat

Modify Vercel API to handle new Google account credentials

Description: Vercel API will have outdated credentials (unsure about TwitchHighlights but it may be in the same boat)
How to get started:

  • change Public/Private keys on vercel and .env file for TwitchHighlights (be sure to do a vercel env pull for local testing on remyx)
  • test login flow after pushing to develop branch of remyx

Accepted:

  • OAuth succeeds and database is updated properly
  • videos can be uploaded without errors

Combined video is never made.

When calling API gateway with the following data:

"clips": [{"startTime": 60, "endTime": 95}, {"startTime": 100, "endTime": 110}],
"videoId": "964746682"
}

It gives a successful response and kicks off the Step Functions. However, new MediaConvert jobs are not getting started.

Add hubspot chat

Add a chat icon to the landing page, and maybe to our app itself, so we can get feedback and chat with customers

Add Stripe checkout to video export flow

We want to charge users $0.99 per video export.

  • Add a product to our Stripe account
  • Integrate a custom Stripe checkout flow inside the 'Video Export' PopConfirm (see the Checkout sketch after this list).
    • When the user clicks 'Combine Selected Clips', we should ask them for their CC details if we don't have them already. Once we have CC details, confirm that they want to spend $0.99 on exporting the video. Then check out the user and actually export the video (to their email)
  • Save the user's credit card so next time they use us, we don't have to ask them for their CC details again
  • (optional) Create an 'Account' page on the left nav bar (underneath 'discord'), and allow the user to see their billing history and saved payment method
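
A server-side sketch of the $0.99 checkout using Stripe's Checkout Sessions; the price ID, URLs, and API version are placeholders:

    import Stripe from 'stripe';

    const stripe = new Stripe(process.env.STRIPE_SECRET_KEY as string, {
      apiVersion: '2020-08-27',
    });

    export const createExportCheckout = (videoId: string, email: string) =>
      stripe.checkout.sessions.create({
        mode: 'payment',
        customer_email: email,
        line_items: [{ price: 'price_videoExport99', quantity: 1 }], // placeholder price ID
        success_url: 'https://pillar.gg/home?export=success',
        cancel_url: 'https://pillar.gg/home?export=cancelled',
        metadata: { videoId },
      });

Saving the card for future exports would mean attaching the payment method to a Stripe Customer (for example via setup_future_usage), which is worth deciding on before building the flow.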

Create a new project for YouTube so we can upload videos

Description: make a dev account for the pillargg Gmail and subscribe to the free tier, then change the remyx/TwitchHighlights code to use the new info
How to get started:

  • make dev account
  • configure YouTube API OAuth routes to match the old account's routes (change dev.clipclock.stream => dev.pillar.gg)

Accepted:

  • OAuth succeeds and database is updated properly
  • videos can be uploaded

Verify that emails are being sent when exporting video

We have the backend code to merge clips + send an email to the user, but we haven't tested it.

Let's verify it actually works:

  • Post a picture to #frontend-team slack verifying that the email was sent and works as intended

Receiving a 200 response when clicking the 'Export Video' button takes way too long

When a user clicks the 'Combine Selected Clips' button, and then clicks the Export Video button, we wait to show them a success message clientside until we get a 200 response from the endpoint. It usually takes like 10-30 seconds, but I feel like it should only take 1-2 seconds to return a 200 response to the client.

In this Loom video, it actually takes 1-5 seconds to show the success message. But it seems like when many more clips are selected, it can take up to 30 seconds to get the 200 response from the server:
https://www.loom.com/share/b1aa8b231ac94be49f2b1dff8247d0f5

Verify that all the pages load without crashing

This is a final check issue before we go live to streamers.

  • Do a run through of the 'happy path' for a user to make sure pages work as intended
  • Click around to make sure everything works and doesn't crash
  • Take notes on small improvements to fix/make (like the favicon), and add it to one issue

Integrate CCCs into frontend

We want users to be able to see clips from a few sources (our own algorithms, as well as Community Created Clips)

Right now, the CCCs aren't integrated into the front end (even though we compute them on the backend).

We want to integrate the CCCs into the front end, and somehow make it clear to the user that these clips come from the community

AC:

  • Show CCCs on front end in the list of clips
  • Somehow make it clear that the clips are from the Twitch community (color them purple? add a tag to the card?); a rough tag sketch follows
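
A tiny sketch of the tag approach, assuming antd's Tag component and a hypothetical source field on the clip:

    import { Tag } from 'antd';

    // Renders a purple label only for community created clips.
    const ClipSourceTag = ({ source }: { source: string }) =>
      source === 'ccc' ? <Tag color="purple">Twitch community clip</Tag> : null;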

Don't re-auth unless needed

Right now it re-auths every time the user clicks "Sign in with Twitch", but we can check the user's cached token and take them straight into the app if it's valid
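
A sketch of that check using Twitch's token validation endpoint; the local storage key is an assumption:

    // Returns true when the cached token is still accepted by Twitch.
    export const hasValidCachedToken = async (): Promise<boolean> => {
      const token = localStorage.getItem('accessToken');
      if (!token) return false;

      const res = await fetch('https://id.twitch.tv/oauth2/validate', {
        headers: { Authorization: `OAuth ${token}` },
      });
      return res.ok; // 200 = still valid; 401 = expired/revoked, so re-auth
    };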

Add way to save users editing progress

We could save it client-side, but it probably makes sense to save it server-side so we can analyze this data in the future, and so it persists for their account instead of for their computer.
