ericjmarti / inventory-hunter

⚡️ Get notified as soon as your next CPU, GPU, or game console is in stock

License: MIT License

Python 87.09% Dockerfile 0.65% Shell 7.33% PowerShell 3.68% JavaScript 1.24%
nvidia amd intel rtx3080 rtx3070 docker bot rtx3090 python raspberrypi

inventory-hunter's Introduction

Inventory Hunter


This bot helped me snag an RTX 3070... hopefully it will help you get your hands on your next CPU, GPU, or game console.

Requirements

  • Raspberry Pi 2 or newer (alternatively, you can use an always-on PC or Mac)
  • Docker (tutorial)

You will also need one of the following alerting mechanisms: a Discord or Slack webhook, a Telegram bot, or an SMTP relay for email notifications.

Quick Start

For instructions specific to Windows, please see this guide instead: Instructions for Windows

These steps should work on any supported Docker platform, but they have been specifically tested on Raspberry Pi OS with Docker already installed.

  1. Clone this repository and pull the latest image from Docker Hub:

    pi@raspberrypi:~ $ git clone https://github.com/EricJMarti/inventory-hunter
    pi@raspberrypi:~ $ cd inventory-hunter
    pi@raspberrypi:~/inventory-hunter $ docker pull ericjmarti/inventory-hunter:latest
    
  2. Create your own configuration file based on one of the provided examples:
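    For illustration, a minimal scraper config might look like the sketch below. This is a hedged example: the fields are the ones that appear elsewhere on this page (refresh_interval, max_price, urls), and the URLs are placeholders, not real product pages.

    ```yaml
    ---
    # Example scraper config (placeholder URLs - substitute real product pages).
    refresh_interval: 2   # seconds between scrapes
    max_price: 700        # optional: only alert at or below this price
    urls:
      - https://www.newegg.com/p/example-product-1
      - https://www.newegg.com/p/example-product-2
    ...
    ```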

  3. Start the Docker container using the provided docker_run.bash script, specifying the required arguments.

    If using Discord or Slack, the format of your command will look like this:

    $ ./docker_run.bash -c <config_file> -a <discord_or_slack> -w <webhook_url>
    
    # Discord example:
    pi@raspberrypi:~/inventory-hunter $ ./docker_run.bash -c ./config/newegg_rtx_3070.yaml -a discord -w https://discord.com/api/webhooks/...
    

    If using an SMTP relay, the format of your command will look like this:

    $ ./docker_run.bash -c <config_file> -e <email_address> -r <relay_ip_address>
    
    # SMTP example:
    pi@raspberrypi:~/inventory-hunter $ ./docker_run.bash -c ./config/newegg_rtx_3070.yaml -e [email protected] -r 127.0.0.1
    

Getting New Code

  1. First, identify any running containers related to inventory-hunter:
    $ docker ps
    
  2. Stop and remove all containers related to inventory-hunter
    $ docker stop CONTAINER_NAME
    $ docker rm CONTAINER_NAME
    
  3. Pull repo updates
    $ git pull
    
  4. Rerun the docker_run.bash command to start containers back up with updates.
    $ ./docker_run.bash -c <config_file> -a <discord_or_slack> -w <webhook_url>
    

Configuring Alerters

If you are interested in configuring multiple alerters or would like to keep your alerter settings saved in a file, you can configure inventory-hunter's alerting mechanism using a config file similar to the existing scraper configs.

  1. Create a file called alerters.yaml in the config directory.

  2. Configure the alerters you would like to use based on this example:

    ---
    alerters:
      discord:
        webhook_url: https://discord.com/api/webhooks/XXXXXXXXXXXX...
        mentions:
          - XXXXXXXXXXXXXXX
          - XXXXXXXXXXXXXXX
      telegram:
        webhook_url: https://api.telegram.org/botXXXXXXXXXXXXXXXXXXXX/sendMessage
        chat_id: XXXXXXXX
      email:
        sender: [email protected]
        recipients:
          - [email protected]
          - [email protected]
        relay: 127.0.0.1
        password: XXXXXXXXXX   # optional
      slack:
        webhook_url: https://hooks.slack.com/services/T00000000/B00000000/XXXXXXXXXXXXXXXXXXXXXXXX
        mentions:
          - XXXXXXXXXXXXXXX
          - XXXXXXXXXXXXXXX
    ...
    
  3. Add this config file to your run command:

    pi@raspberrypi:~/inventory-hunter $ ./docker_run.bash -c ./config/newegg_rtx_3070.yaml -q ./config/alerters.yaml
    

How it works

The general idea is that if you are notified the moment a product comes back in stock, you have a chance to purchase it before scalpers clear out the inventory. This script continually refreshes a set of URLs, looking for an "add to cart" phrase. Once the phrase is detected, an automated alert is sent, giving you an opportunity to react.
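The refresh-and-detect loop can be sketched in a few lines of Python. This is a minimal illustration of the idea, not the project's actual scraper; `check_stock_html` and its phrase list are assumptions made for the example.

```python
# Minimal sketch of the core idea (illustrative, not inventory-hunter's code):
# fetch a product page and look for an "add to cart" phrase in the HTML.
import urllib.request

IN_STOCK_PHRASES = ("add to cart", "add to basket")  # assumed keyword list

def check_stock_html(html: str) -> bool:
    """Return True if the page text suggests the item can be purchased."""
    text = html.lower()
    return any(phrase in text for phrase in IN_STOCK_PHRASES)

def check_stock(url: str) -> bool:
    """Fetch a page and run the keyword check on its contents."""
    with urllib.request.urlopen(url, timeout=10) as resp:
        return check_stock_html(resp.read().decode("utf-8", errors="replace"))
```

A real scraper also needs per-site logic (the project registers custom scrapers per domain, as its logs show), rate limiting, and price extraction; this sketch only captures the keyword-detection step.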

FAQ

How is this different from existing online inventory trackers?

Before developing inventory-hunter, I used several existing services without any luck. By the time I received an alert, the product had already been scalped. This bot alerts faster than existing trackers for several reasons:

  • it runs on your own hardware, so no processing time is spent servicing other users
  • you get to choose which products you want to track
  • you are in control of the refresh frequency

What if inventory-hunter gets used by scalpers?

I sure hope this doesn't happen... 2020 and 2021 are bad enough already. My hope is that inventory-hunter levels the playing field a bit by giving real customers a better opportunity than they had previously. Serious scalpers will continue using automated checkout bots, and it is up to online retailers to combat this malarkey.

Do I really need Docker?

No, but I highly recommend it. If you know your way around Python and pip/conda, you should be able to replicate the environment I created with Docker.

inventory-hunter's People

Contributors

aaronshirley751, bandits13, burito55, dandanio, dareed, electroskull57, ericjmarti, fontune, frantisekbrabec, g1n93r, gu1ll0me, hartmms, itsmylife44, jbowring, mrizkbv, ms7m, phyltr, raymondarias, rushyrush, thecyberbutler


inventory-hunter's Issues

Repeating notifications on alert

Hello, and thanks for the helpful tool! The instructions were straightforward, and I had a spare RPi up and running in no time. An item in my config must have become available for a short while today, because I received a notification. Actually, I received 79 emails on the same topic, spaced 1-5 minutes apart.

My config is modeled after the checked-in 6800 XT config: no price limit (which I will change!) and a 2s scrape interval. In Postfix I added an alias that sends a text via email to @mms.att.net. I started the Docker container with my default email and the alert alias.

Any thoughts on what is happening? What other info could I provide?

Thank you!

Page crash after running for brief time.

When I build the Docker image from latest, my Newegg container runs for a bit, then gets a page-crash error, and all subsequent scrapes fail.

My old Docker container from a previous build with the Newegg config still works fine.

Running my B&H Photo config with the latest build also works just fine.

Full log file here:
ba0bae8423d168c965be026f7338fcbdf5c5bfd9a48e4a584d6fccdbef2c4109-json.log

E2020-11-29 13:43:51,824 gigabyte-geforce-rtx-3070-gv-n3070eagle-oc-8gd: caught exception during request: Message: unknown error: session deleted because of page crash
from unknown error: cannot determine loading status
from tab crashed
  (Session info: headless chrome=83.0.4103.116)

E2020-11-29 13:43:51,824 gigabyte-geforce-rtx-3070-gv-n3070eagle-oc-8gd: scrape failed
E2020-11-29 13:43:53,610 gigabyte-geforce-rtx-3070-gv-n3070gaming-oc-8gd: caught exception during request: Message: invalid session id

E2020-11-29 13:43:53,610 gigabyte-geforce-rtx-3070-gv-n3070gaming-oc-8gd: scrape failed

Basic Docker Commands

Could someone put together a quick list of common Docker commands so people like me have a reference? For example: how to see which containers are active, how to stop one or all of them, how to remove them, and so on. I'm pretty new to this, and I think a reference like that would help others verify their containers are running, see how long they've been up, and pick up anything else that's nice to know.

Thank you to anyone helping on this project. Never thought I would be dabbling in this just to get a darn gpu.

Added some addresses if interested

Here are some addresses for the AMD 6000 series if interested:

urls:


urls:


urls:

...


urls:


refresh_interval: 2 # seconds
urls:


refresh_interval: 2 # seconds
urls:

...

There are a couple of new ones on Newegg that I found, but most are just duplicates of yours.

Docker_Run.Bash Example

I have completed all the steps, set everything up, and confirmed I receive the email when testing with the "sendmail" command. Two quick questions and I should be up and running; any help would be greatly appreciated.

  1. I tested the Gmail setup and I do get an email from the "sendmail" command, but when I try to restart Postfix it tells me "sudo: systemct1: command not found". So email is working, but I'm not sure Postfix is actually running behind the scenes. Thoughts?

  2. I am wondering if someone can give a sample layout of the final command. I want to make sure I'm formatting it correctly. I updated my YAML files and saved them, but I'm not sure how to point the command line at YAML files saved in a folder. Also, since I'm using Gmail, I assume I should use 127.0.0.1 for the relay IP.

$ ./docker_run.bash -c <config_file> -e <email_address> -r <relay_ip_address>

This is awesome, and thank you everyone for helping me, as I am newer to this kind of coding but learning. Thank you for all you have done as well, Eric!

Updating Code Tutorial

Hello,
I am trying to update my current system with your new updates, but I'm not sure how to go about this. Any help would be greatly appreciated. (It would be nice to see this in the readme, but mostly just because I'm a beginner, lol.)

when scraping amazon there is an exception

These are the links I'm trying to scrape:

---
refresh_interval: 2 # seconds
urls:
  - 
  - https://www.amazon.com/ASUS-Graphics-DisplayPort-Military-Grade-Certification/dp/B08HH5WF97
  - https://www.amazon.com/GeForce-Gaming-Graphics-Technology-Backplate/dp/B08NWBG14N
  - https://www.amazon.com/MSI-GeForce-RTX-3080-10G/dp/B08HR7SV3M
  - https://www.amazon.com/ASUS-Graphics-DisplayPort-Axial-tech-2-9-Slot/dp/B08J6F174Z
...

D2020-12-03 02:21:49,882 using parser: html.parser
D2020-12-03 02:21:49,884 registering custom scraper for domain: amazon
D2020-12-03 02:21:49,890 registering custom scraper for domain: bestbuy
D2020-12-03 02:21:49,896 registering custom scraper for domain: bhphotovideo
D2020-12-03 02:21:49,902 registering custom scraper for domain: microcenter
D2020-12-03 02:21:49,910 registering custom scraper for domain: newegg
E2020-12-03 02:21:50,006 caught exception
Traceback (most recent call last):
  File "/src/run.py", line 41, in main
    config = parse_config(args.config)
  File "/src/config.py", line 37, in parse_config
    return Config(refresh_interval, max_price, data['urls'])
  File "/src/config.py", line 23, in __init__
    self.urls = [URL(url) for url in sorted(set(urls))]
TypeError: '<' not supported between instances of 'NoneType' and 'str'
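The traceback above comes from the blank `- ` entry in the YAML: PyYAML parses an empty sequence item as None, and sorting a list that mixes None with strings raises exactly this TypeError. A minimal demonstration and defensive fix (a sketch, not the project's actual code):

```python
# The blank "- " entry in a YAML list parses to None; sorting a mixed
# None/str list raises TypeError. Dropping falsy entries first avoids it.
urls = [None,
        "https://www.amazon.com/dp/B08HR7SV3M",
        "https://www.amazon.com/dp/B08HH5WF97"]

try:
    sorted(set(urls))  # reproduces the error from the traceback above
except TypeError as exc:
    print(f"caught: {exc}")

cleaned = sorted(u for u in set(urls) if u)  # skip None/empty entries
print(cleaned)
```

Removing the stray `- ` line from the config avoids the crash without any code change; the filter above would merely make the parser tolerant of it.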

Best Buy stating Scrape Failed

This now runs perfectly and I get my alerts via email, but for Best Buy it says scrape failed: caught exception during request: got response with status code 403 for https://www.Bestbuy...

Any recommendations? I pulled the latest files just now and tried again, but it didn't work. Thanks!

Getting an issue for the email sending to myself any help is appreciated

I2020-11-30 23:48:45,139 gigabyte-geforce-rtx-2070-super-gv-n207swf3oc-8gd: now in stock at 859.9!
E2020-11-30 23:48:45,610 caught exception
Traceback (most recent call last):
File "/src/run.py", line 32, in main
hunt(args, config, driver)
File "/src/hunter.py", line 121, in hunt
engine.run()
File "/src/hunter.py", line 43, in run
self.scheduler.run(blocking=True)
File "/usr/local/lib/python3.9/sched.py", line 151, in run
action(*argument, **kwargs)
File "/src/hunter.py", line 58, in tick
self.process_scrape_result(s, result)
File "/src/hunter.py", line 95, in process_scrape_result
self.send_alert(s, result, f'now in stock at {current_price}!')
File "/src/hunter.py", line 116, in send_alert
self.alerter(result.alert_subject, result.alert_content)
File "/src/hunter.py", line 29, in call
s.send_message(msg)
File "/usr/local/lib/python3.9/smtplib.py", line 970, in send_message
return self.sendmail(from_addr, to_addrs, flatmsg, mail_options,
File "/usr/local/lib/python3.9/smtplib.py", line 885, in sendmail
raise SMTPRecipientsRefused(senderrs)
smtplib.SMTPRecipientsRefused: {'[email protected]': (454, b'4.7.1 [email protected]: Relay access denied')}

Docker Container Exiting Immediately

When running the given ./docker_run.bash command, I get a string of characters (which changes every time I run it), and the container appears to exit.

When I check whether any Docker containers are currently running (with docker ps), nothing shows up, but running docker ps --filter "status=exited" shows the containers have already exited.

Is there a log file where I can look for errors to see what the issue is?

Relay IP Address

First of all, thank you Eric! I'm fortunate to have a spare Raspberry Pi that I'm not using for RetroPie, and this is a great idea. I've been trying for two months to get a 3080 with no luck, and I'm hoping this helps me finally obtain one, as my 1080 is feeling its age. I've dabbled in this a bit through work and RetroPie, but I'm curious about the "Relay IP Address": is this the IP assigned to my Raspberry Pi, my main WAN IP, or am I missing it entirely? Thanks in advance!

[Question] is it possible to run this on an Ubuntu cloud server?

I've been trying to run this on Ubuntu, and everything seems to work fine until it detects an item in stock, at which point I get errors. I've tested the SMTP relay and Docker, and both seem to be working. I don't have a Raspberry Pi, but I already had the Ubuntu server for another project and figured it would do.
After I run the program with ./docker_run.bash -c /root/inventory-hunter/config/newegg_rtx_3070.yaml -e [email protected] -r [smtp.gmail.com]:587, the logs look normal until it gets to the 2070 Super; it logs the item as in stock, and then I get the following errors:

Traceback (most recent call last):
  File "/src/run.py", line 32, in main
    hunt(args, config, driver)
  File "/src/hunter.py", line 121, in hunt
    engine.run()
  File "/src/hunter.py", line 43, in run
    self.scheduler.run(blocking=True)
  File "/usr/local/lib/python3.9/sched.py", line 151, in run
    action(*argument, **kwargs)
  File "/src/hunter.py", line 58, in tick
    self.process_scrape_result(s, result)
  File "/src/hunter.py", line 95, in process_scrape_result
    self.send_alert(s, result, f'now in stock at {current_price}!')
  File "/src/hunter.py", line 116, in send_alert
    self.alerter(result.alert_subject, result.alert_content)
  File "/src/hunter.py", line 27, in __call__
    with smtplib.SMTP(self.relay) as s:
  File "/usr/local/lib/python3.9/smtplib.py", line 253, in __init__
    (code, msg) = self.connect(host, port)
  File "/usr/local/lib/python3.9/smtplib.py", line 339, in connect
    self.sock = self._get_socket(host, port, self.timeout)
  File "/usr/local/lib/python3.9/smtplib.py", line 310, in _get_socket
    return socket.create_connection((host, port), timeout,
  File "/usr/local/lib/python3.9/socket.py", line 822, in create_connection
    for res in getaddrinfo(host, port, 0, SOCK_STREAM):
  File "/usr/local/lib/python3.9/socket.py", line 953, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):

I tried looking at the source code, but I couldn't figure out whether it's a compatibility problem or I'm missing something; my Python knowledge is very limited. Any help would be appreciated.

[Question] Windows friendly tutorial for noobs?

Hello Eric, thank you for this awesome script. I'm attempting to run it on my Windows computer. I managed to get Docker and Git working in PowerShell and got as far as step 3, but I keep getting the following errors.

PS C:\Users\steve\inventory-hunter> docker run docker_run.bash -c "config.yaml" -e "[email protected]" -r 192.168.0.1
Unable to find image 'docker_run.bash:latest' locally
docker: Error response from daemon: pull access denied for docker_run.bash, repository does not exist or may require 'docker login': denied: requested access to the resource is denied.
See 'docker run --help'.
PS C:\Users\steve\inventory-hunter> docker run inventory-hunter
usage: run.py [-h] [-c CONFIG] [-e EMAIL [EMAIL ...]] -r RELAY [-v]
run.py: error: argument -c/--config: can't open 'config.yaml': [Errno 2] No such file or directory: 'config.yaml'
PS C:\Users\steve\inventory-hunter>

Obviously, I don't have much experience running python scripts. What am I doing wrong?

[QUESTION] Issues getting this to work.

I want to preface this by saying I am very new to working with Raspberry Pis. I set up the email system on the same Pi as inventory-hunter and have verified that the emails are working. Everything seems fine, and based on another question you answered, I set the relay to the Pi's local IP address, 10.0.0.131. I'm assuming the email is the one I want the notifications sent to, and I specified the config file as "bhphoto_rtx_3070.yaml". Every time I try to run the docker_run.bash command, it says the config file does not exist or is not a regular file. If you can help me, I would really appreciate it. This was another cool project to try on my Pi 4 and help a friend get his hands on the almighty 3070.

Refresh Not Working at Specified Rate

Hi Eric, I found your code online and it is awesome! I've got a bunch of config files that seem to be working. The only thing I'm noticing is that there seems to be a ~90-second gap between stock checks. I'm attaching a screenshot of my Docker logs. Not a big issue; just curious if you know of a quick fix.

P.S. I do have the refresh interval set to 2, as in your examples.

Container status immediately exiting

I've been trying to figure this out for a while, but I can't understand why my containers exit right after I create them. I ran docker logs [ID] and got: run.py: error: the following arguments are required: -a/--alerter
I'm using email as my alerter type.

Failing to run on 3rd step

On the third step, when I go to run, I get this error. Any ideas?

./docker_run.bash -c /home/pi/inventory-hunter/config/newegg_rtx_3080.yaml -e [email protected] -r 192.168.86.1
the inventory-hunter docker image does not exist... please build the image and try again
build command: docker build -t inventory-hunter .

[QUESTION] Docker image build

Hello, I have a problem when executing the .bash script: it says "the inventory-hunter docker image does not exist... please build the image and try again", but I already built the image and I don't know what else to change. It would be very kind of you to help; maybe I did something wrong, but I followed all the steps you gave.

Docker Set Up Step #3 and overall process to set up

Every time I get to step 3 of the Docker setup, it doesn't work: it says the user doesn't exist. As a side note, I have a fresh install of Raspberry Pi OS and I'm doing everything from my 'pi' account. Please check my steps and tell me where I'm going wrong:

All done as logged into 'pi'

  1. Install Raspberry pi OS
  2. Boot up Raspberry Pi and update to latest OS software
  3. Install Docker and updates/upgrades. (Stuck on #3 as mentioned above)
  4. git clone of inventory-hunter (asking for login credentials)
  5. Tried SMTP to no avail, and also Discord. Is Discord as easy as just using the webhook URL, or do I need to do something else with it? The instructions' example mentions setting up JSON and adding an extension to the URL; I'm not sure I need to go that far.

Sorry for the possibly dumb questions, but I've built numerous RetroPies and have little experience with this kind of setup. I've been trying feverishly for two months to buy one simple 3080 GPU, and I'm beyond annoyed that it has come down to me learning to code just to buy a GPU. I'm willing to do it, but I need some help, as I'm struggling. Thanks in advance!

one config for multiple websites

Is it possible to create one config file that polls multiple sites, meaning I can have links for specific cards at different websites, such as the ASUS Strix 3080 on Newegg, B&H, Micro Center, etc.?

Updating code in docker or rebuild?

Is this, or can this be, set up so that Docker pulls image updates without rebuilding the container each time? I'm only somewhat familiar with this, so right now I hit it with a big hammer: re-cloning the git repo and rebuilding the Docker image. Not exactly the fastest or most efficient method. If you have ideas, or want to add a how-to-maintain section to the how-to, it would be much appreciated.

Keep up the great work and thanks for helping to beat the scalpers! No way I'm paying +50%-stupid% markups.

Thanks!

Simple suggestion + tips for noobs (like me)

One thing that might be helpful is to have docker_run.bash name the container it creates after the config file. If I knew how to work Git worth a damn, I'd submit a pull request for this, since I assume it's easy.

Since I was usually creating multiple containers, one for each site, it was difficult to tell which container was for which site.

CONTAINER ID        IMAGE               COMMAND                  CREATED             STATUS              PORTS               NAMES
b4ea0984ef2e        inventory-hunter    "python /src/run.py …"   3 minutes ago       Up 3 minutes                            bold_solomon
265636d8ccdb        inventory-hunter    "python /src/run.py …"   20 minutes ago      Up 20 minutes                           sleepy_golick
d6c9da1a5081        inventory-hunter    "python /src/run.py …"   20 minutes ago      Up 20 minutes                           blissful_tharp

Tip for noobs - to display running containers: docker ps

How do I tell if it's working?
After running docker ps copy the name you want to see if it's working and run docker logs -f <name>

Example from the command above: docker logs -f bold_solomon

Output - this is what I see when it's working. If it's not working, you'll find hints here as to why not.

I2020-11-28 14:30:11,123 gigabyte-geforce-rtx-2070-super-gv-n207swf3oc-8gd: still in stock at the same price
I2020-11-28 14:30:13,143 gigabyte-geforce-rtx-3070-gv-n3070aorus-m-8gd: not in stock
I2020-11-28 14:30:15,113 gigabyte-geforce-rtx-3070-gv-n3070eagle-8gd: not in stock
I2020-11-28 14:30:17,111 gigabyte-geforce-rtx-3070-gv-n3070eagle-oc-8gd: not in stock
I2020-11-28 14:30:19,117 gigabyte-geforce-rtx-3070-gv-n3070gaming-oc-8gd: not in stock
I2020-11-28 14:30:21,094 gigabyte-geforce-rtx-3070-gv-n3070vision-oc-8gd: not in stock

amazon_rtx_3070 container process forks

Had an issue last night where my Docker host stopped allowing new processes. After some digging, it seems the amazon_rtx_3070 container may be responsible: compared to the others, it forks a ton of processes. Has anyone else seen this?

This screenshot was taken a few minutes after launch
$ docker stats
Screenshot from 2020-12-03 10-52-37

I'm using the default config/amazon_rtx_3070.yaml from the latest commit

Here's the docker logs output for the container. Not sure if the missing title entries could be related to this or not.

$ docker logs -f amazon_rtx_3070
D2020-12-03 17:25:48,107 starting with args: /src/run.py --alerter email --email <redacted> --relay <redacted>
D2020-12-03 17:25:48,394 using parser: lxml
D2020-12-03 17:25:48,394 registering custom scraper for domain: amazon
D2020-12-03 17:25:48,395 registering custom scraper for domain: bestbuy
D2020-12-03 17:25:48,396 registering custom scraper for domain: bhphotovideo
D2020-12-03 17:25:48,397 registering custom scraper for domain: microcenter
D2020-12-03 17:25:48,399 registering custom scraper for domain: newegg
D2020-12-03 17:25:48,400 registering custom scraper for domain: walmart
I2020-12-03 17:25:48,419 scraper initialized for https://www.amazon.com/dp/B08HBF5L3K
I2020-12-03 17:25:48,420 scraper initialized for https://www.amazon.com/dp/B08HBJB7YD
I2020-12-03 17:25:48,420 scraper initialized for https://www.amazon.com/dp/B08KWLMZV4
I2020-12-03 17:25:48,420 scraper initialized for https://www.amazon.com/dp/B08KWN2LZG
I2020-12-03 17:25:48,420 scraper initialized for https://www.amazon.com/dp/B08KWPDXJZ
I2020-12-03 17:25:48,420 scraper initialized for https://www.amazon.com/dp/B08KXZV626
I2020-12-03 17:25:48,421 scraper initialized for https://www.amazon.com/dp/B08KY266MG
I2020-12-03 17:25:48,421 scraper initialized for https://www.amazon.com/dp/B08KY322TH
I2020-12-03 17:25:48,421 scraper initialized for https://www.amazon.com/dp/B08L8HPKR6
I2020-12-03 17:25:48,421 scraper initialized for https://www.amazon.com/dp/B08L8JNTXQ
I2020-12-03 17:25:48,421 scraper initialized for https://www.amazon.com/dp/B08L8KC1J7
I2020-12-03 17:25:48,421 scraper initialized for https://www.amazon.com/dp/B08L8L71SM
I2020-12-03 17:25:48,421 scraper initialized for https://www.amazon.com/dp/B08L8L9TCZ
I2020-12-03 17:25:48,422 scraper initialized for https://www.amazon.com/dp/B08L8LG4M3
I2020-12-03 17:25:48,422 scraper initialized for https://www.amazon.com/dp/B08LF1CWT2
I2020-12-03 17:25:48,422 scraper initialized for https://www.amazon.com/dp/B08LF32LJ6
I2020-12-03 17:25:48,422 scraper initialized for https://www.amazon.com/dp/B08LW46GH2
W2020-12-03 17:25:50,424 warning: using selenium webdriver for scraping... this feature is under active development
W2020-12-03 17:25:52,334 missing title: https://www.amazon.com/dp/B08HBF5L3K
I2020-12-03 17:25:52,337 B08HBF5L3K: not in stock
W2020-12-03 17:25:54,313 missing title: https://www.amazon.com/dp/B08HBJB7YD
I2020-12-03 17:25:54,315 B08HBJB7YD: not in stock
W2020-12-03 17:25:56,306 missing title: https://www.amazon.com/dp/B08KWLMZV4
I2020-12-03 17:25:56,308 B08KWLMZV4: not in stock
W2020-12-03 17:25:58,305 missing title: https://www.amazon.com/dp/B08KWN2LZG
I2020-12-03 17:25:58,309 B08KWN2LZG: not in stock
W2020-12-03 17:26:00,323 missing title: https://www.amazon.com/dp/B08KWPDXJZ
I2020-12-03 17:26:00,326 B08KWPDXJZ: not in stock
W2020-12-03 17:26:02,326 missing title: https://www.amazon.com/dp/B08KXZV626
I2020-12-03 17:26:02,329 B08KXZV626: not in stock
W2020-12-03 17:26:04,314 missing title: https://www.amazon.com/dp/B08KY266MG
I2020-12-03 17:26:04,316 B08KY266MG: not in stock
W2020-12-03 17:26:06,377 missing title: https://www.amazon.com/dp/B08KY322TH
I2020-12-03 17:26:06,380 B08KY322TH: not in stock
W2020-12-03 17:26:08,337 missing title: https://www.amazon.com/dp/B08L8HPKR6
I2020-12-03 17:26:08,340 B08L8HPKR6: not in stock
W2020-12-03 17:26:10,328 missing title: https://www.amazon.com/dp/B08L8JNTXQ
I2020-12-03 17:26:10,330 B08L8JNTXQ: not in stock
W2020-12-03 17:26:12,314 missing title: https://www.amazon.com/dp/B08L8KC1J7
I2020-12-03 17:26:12,316 B08L8KC1J7: not in stock
W2020-12-03 17:26:14,316 missing title: https://www.amazon.com/dp/B08L8L71SM
I2020-12-03 17:26:14,318 B08L8L71SM: not in stock
W2020-12-03 17:26:16,330 missing title: https://www.amazon.com/dp/B08L8L9TCZ
I2020-12-03 17:26:16,333 B08L8L9TCZ: not in stock
W2020-12-03 17:26:18,308 missing title: https://www.amazon.com/dp/B08L8LG4M3
I2020-12-03 17:26:18,311 B08L8LG4M3: not in stock
W2020-12-03 17:26:20,305 missing title: https://www.amazon.com/dp/B08LF1CWT2
I2020-12-03 17:26:20,308 B08LF1CWT2: not in stock
W2020-12-03 17:26:22,324 missing title: https://www.amazon.com/dp/B08LF32LJ6
I2020-12-03 17:26:22,327 B08LF32LJ6: not in stock
W2020-12-03 17:26:24,348 missing title: https://www.amazon.com/dp/B08LW46GH2
I2020-12-03 17:26:24,350 B08LW46GH2: not in stock
W2020-12-03 17:26:26,337 missing title: https://www.amazon.com/dp/B08HBF5L3K
I2020-12-03 17:26:26,340 B08HBF5L3K: not in stock
W2020-12-03 17:26:28,378 missing title: https://www.amazon.com/dp/B08HBJB7YD
I2020-12-03 17:26:28,380 B08HBJB7YD: not in stock
W2020-12-03 17:26:30,348 missing title: https://www.amazon.com/dp/B08KWLMZV4
I2020-12-03 17:26:30,350 B08KWLMZV4: not in stock
W2020-12-03 17:26:32,331 missing title: https://www.amazon.com/dp/B08KWN2LZG
I2020-12-03 17:26:32,334 B08KWN2LZG: not in stock
W2020-12-03 17:26:34,345 missing title: https://www.amazon.com/dp/B08KWPDXJZ
I2020-12-03 17:26:34,347 B08KWPDXJZ: not in stock
W2020-12-03 17:26:36,374 missing title: https://www.amazon.com/dp/B08KXZV626
I2020-12-03 17:26:36,376 B08KXZV626: not in stock
W2020-12-03 17:26:38,332 missing title: https://www.amazon.com/dp/B08KY266MG
I2020-12-03 17:26:38,335 B08KY266MG: not in stock

Help: euro (€) prices on Amazon contain &nbsp; characters

I read about changing the Dockerfile, but with the euro there is a problem on Amazon France, Italy, Germany, and every other country that prices in euros: the price contains strange characters, e.g. 19,99&nbsp;€.

Example:
<span id="price_inside_buybox" class="a-size-medium a-color-price"> 19,99&nbsp;€ </span>

And the program tells me it's not able to convert the price string to a float.

I edited en_US.UTF-8 to it_IT.UTF-8 (also fr and de), but it didn't work.

Has anyone managed to make it work with €?
Thanks
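A possible workaround for the float-conversion failure described above (a sketch, not part of inventory-hunter): normalize the string before calling float(), stripping the non-breaking space (&nbsp; decodes to \xa0) and the euro sign, then swapping the European decimal and thousands separators.

```python
# Normalize a European-formatted Amazon price string, e.g. "19,99\xa0€",
# into a float. Illustrative sketch; not the project's actual parsing code.
def parse_euro_price(text: str) -> float:
    # strip non-breaking spaces, the euro sign, and surrounding whitespace
    cleaned = text.replace("\xa0", " ").replace("€", "").strip()
    # European style: ',' is the decimal separator, '.' groups thousands
    cleaned = cleaned.replace(".", "").replace(",", ".")
    return float(cleaned)
```

For example, parse_euro_price("19,99\xa0€") yields 19.99 and parse_euro_price("1.299,00 €") yields 1299.0.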

Blocked from NewEgg.com

I get an error when accessing Newegg.com: "We're sorry, but our systems have detected the possible use of an automated program to visit Newegg.com."

Any solutions, please?

error when building docker

Good Morning,

when I am building the docker image I get the following error:

Collecting urllib3<1.27,>=1.21.1
Downloading urllib3-1.26.2-py2.py3-none-any.whl (136 kB)
ERROR: THESE PACKAGES DO NOT MATCH THE HASHES FROM THE REQUIREMENTS FILE. If you have updated the package versions, please update the hashes. Otherwise, examine the package contents carefully; someone may have tampered with them.
lxml from https://files.pythonhosted.org/packages/8e/9c/74248ebfd8893bc8ae87abdcc15dd195d01a737ec6e5f91fa503f667e1b9/lxml-4.6.2-cp39-cp39-manylinux1_x86_64.whl#sha256=7e9eac1e526386df7c70ef253b792a0a12dd86d833b1d329e038c7a235dfceb5 (from -r /src/requirements.txt (line 2)):
Expected sha256 7e9eac1e526386df7c70ef253b792a0a12dd86d833b1d329e038c7a235dfceb5
Got 5e153b711867e66826c7354753674e4fe4baae5c58dd20681fe4029f004e28fb

urllib3<1.27,>=1.21.1 from https://files.pythonhosted.org/packages/f5/71/45d36a8df68f3ebb098d6861b2c017f3d094538c0fb98fa61d4dc43e69b9/urllib3-1.26.2-py2.py3-none-any.whl#sha256=d8ff90d979214d7b4f8ce956e80f4028fc6860e4431f731ea4a8c08f23f99473 (from requests->-r /src/requirements.txt (line 4)):
    Expected sha256 d8ff90d979214d7b4f8ce956e80f4028fc6860e4431f731ea4a8c08f23f99473
         Got        69d396d850e63a8aea3d4b22a3ae6197b34574e6264ba8b08cff41f37f0ebc8e

WARNING: You are using pip version 20.2.4; however, version 20.3 is available.
You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
The command '/bin/sh -c pip install -r /src/requirements.txt' returned a non-zero code: 1
[root@localhost inventory-hunter]#
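The check pip is performing here can be reproduced by hand, which helps tell a corrupted or cached download apart from a genuinely changed package. A minimal sketch of the same comparison (the file path and digest in the comment are placeholders, not the ones from this build):

```python
import hashlib


def sha256_of(path: str) -> str:
    """Return the hex sha256 digest of a file, read in chunks."""
    digest = hashlib.sha256()
    with open(path, "rb") as fh:
        for chunk in iter(lambda: fh.read(65536), b""):
            digest.update(chunk)
    return digest.hexdigest()


# Compare against the pin in requirements.txt, e.g.:
# assert sha256_of("urllib3-1.26.2-py2.py3-none-any.whl").startswith("d8ff90d9")
```

In practice a mismatch like this is often a proxy or a stale partial download rather than tampering; retrying with pip's cache disabled (`pip install --no-cache-dir ...`) is a common first step.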

[Question] Under what circumstance does it send an email?

(I'm new to coding like this, so sorry if this isn't clear.)
I started two Docker containers successfully, and when I run the commands to show the logs, they accurately report the status of the item. As suggested in another question, I added a product that is in stock as a test, and the logs say it's in stock at the same price, but I haven't received an email even for the first run. So I'd like to know what event triggers an email. I followed the README all the way through and typed the email address and my IP correctly.
In case this is relevant: I'm not using a Raspberry Pi; I set up WSL2 and am typing in the Ubuntu terminal from the Microsoft Store.

Edit: I forgot to include this question: if we're looking for a different item (for example, I'm searching for the 3070 at Best Buy but the only config file in the folder is for the 3080), do I just replace the 3080 link with the 3070 link from Best Buy, or are there other steps to follow?
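inventory-hunter's exact trigger conditions live in its source, but scrapers of this kind typically alert on a state *transition* rather than on the state itself, which would explain never getting an email for an item that was already in stock when the container started. A hedged sketch of that pattern (the function and its semantics are illustrative, not the project's actual API):

```python
from typing import Optional


def should_alert(previous_in_stock: Optional[bool], currently_in_stock: bool) -> bool:
    """Alert only on an out-of-stock -> in-stock transition (illustrative).

    previous_in_stock is None on the very first check, so under this policy a
    product that is already in stock at startup never fires an alert.
    """
    return currently_in_stock and previous_in_stock is False
```

If the project follows a policy like this, the way to test email delivery is to start watching an out-of-stock item and wait for it to flip, or to point the config at an item that restocks frequently.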

Error when trying to build wheel for lxml (setup.py)

Running setup.py clean for lxml
  ERROR: Command errored out with exit status 1:
   command: /usr/local/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-kdw1mm6q/lxml/setup.py'"'"'; __file__='"'"'/tmp/pip-install-kdw1mm6q/lxml/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-m01b78xu
       cwd: /tmp/pip-install-kdw1mm6q/lxml/
  Complete output (84 lines):
  Building lxml version 4.6.2.
  Building without Cython.
  Building against libxml2 2.9.4 and libxslt 1.1.32
  running bdist_wheel
  running build
  running build_py
  creating build
  creating build/lib.linux-armv7l-3.9
  creating build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/ElementInclude.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/cssselect.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/usedoctest.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/__init__.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/builder.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/doctestcompare.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/_elementpath.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/pyclasslookup.py -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/sax.py -> build/lib.linux-armv7l-3.9/lxml
  creating build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/__init__.py -> build/lib.linux-armv7l-3.9/lxml/includes
  creating build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/usedoctest.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/__init__.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/formfill.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/soupparser.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/builder.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/_diffcommand.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/_html5builder.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/ElementSoup.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/clean.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/_setmixin.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/diff.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/html5parser.py -> build/lib.linux-armv7l-3.9/lxml/html
  copying src/lxml/html/defs.py -> build/lib.linux-armv7l-3.9/lxml/html
  creating build/lib.linux-armv7l-3.9/lxml/isoschematron
  copying src/lxml/isoschematron/__init__.py -> build/lib.linux-armv7l-3.9/lxml/isoschematron
  copying src/lxml/etree.h -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/etree_api.h -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/lxml.etree.h -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/lxml.etree_api.h -> build/lib.linux-armv7l-3.9/lxml
  copying src/lxml/includes/relaxng.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/tree.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/xinclude.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/config.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/xslt.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/c14n.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/uri.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/schematron.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/__init__.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/xpath.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/etree_defs.h -> build/lib.linux-armv7l-3.9/lxml/includes
  copying src/lxml/includes/lxml-version.h -> build/lib.linux-armv7l-3.9/lxml/includes
  creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources
  creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/rng
  copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/rng
  creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl
  copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl
  copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl
  creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
  copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
  copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
  copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
  copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
  copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
  copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
  running build_ext
  building 'lxml.etree' extension
  creating build/temp.linux-armv7l-3.9
  creating build/temp.linux-armv7l-3.9/src
  creating build/temp.linux-armv7l-3.9/src/lxml
  gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -DCYTHON_CLINE_IN_TRACEBACK=0 -I/usr/include/libxml2 -Isrc -Isrc/lxml/includes -I/usr/local/include/python3.9 -c src/lxml/etree.c -o build/temp.linux-armv7l-3.9/src/lxml/etree.o -w
  gcc: fatal error: Killed signal terminated program as
  compilation terminated.
  Compile failed: command '/usr/bin/gcc' failed with exit code 1
  creating tmp
  cc -I/usr/include/libxml2 -I/usr/include/libxml2 -c /tmp/xmlXPathInit6i1uyksx.c -o tmp/xmlXPathInit6i1uyksx.o
  cc tmp/xmlXPathInit6i1uyksx.o -lxml2 -o a.out
  error: command '/usr/bin/gcc' failed with exit code 1
  ----------------------------------------
  ERROR: Failed building wheel for lxml

and this error:

  Running setup.py install for lxml: started
    Running setup.py install for lxml: still running...
    Running setup.py install for lxml: finished with status 'error'
    ERROR: Command errored out with exit status 1:
     command: /usr/local/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-kdw1mm6q/lxml/setup.py'"'"'; __file__='"'"'/tmp/pip-install-kdw1mm6q/lxml/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-lj2i0for/install-record.txt --single-version-externally-managed --compile --install-headers /usr/local/include/python3.9/lxml
         cwd: /tmp/pip-install-kdw1mm6q/lxml/
    Complete output (83 lines):
    Building lxml version 4.6.2.
    Building without Cython.
    Building against libxml2 2.9.4 and libxslt 1.1.32
    running install
    running build
    running build_py
    creating build
    [... identical build_py file-copy output to the bdist_wheel log above ...]
    running build_ext
    building 'lxml.etree' extension
    creating build/temp.linux-armv7l-3.9
    creating build/temp.linux-armv7l-3.9/src
    creating build/temp.linux-armv7l-3.9/src/lxml
    gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -DCYTHON_CLINE_IN_TRACEBACK=0 -I/usr/include/libxml2 -Isrc -Isrc/lxml/includes -I/usr/local/include/python3.9 -c src/lxml/etree.c -o build/temp.linux-armv7l-3.9/src/lxml/etree.o -w
    gcc: fatal error: Killed signal terminated program as
    compilation terminated.
    Compile failed: command '/usr/bin/gcc' failed with exit code 1
    cc -I/usr/include/libxml2 -I/usr/include/libxml2 -c /tmp/xmlXPathInit7r1hfboz.c -o tmp/xmlXPathInit7r1hfboz.o
    cc tmp/xmlXPathInit7r1hfboz.o -lxml2 -o a.out
    error: command '/usr/bin/gcc' failed with exit code 1
    ----------------------------------------
ERROR: Command errored out with exit status 1: /usr/local/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-kdw1mm6q/lxml/setup.py'"'"'; __file__='"'"'/tmp/pip-install-kdw1mm6q/lxml/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' install --record /tmp/pip-record-lj2i0for/install-record.txt --single-version-externally-managed --compile --install-headers /usr/local/include/python3.9/lxml Check the logs for full command output.
WARNING: You are using pip version 20.2.4; however, version 20.3 is available.
You should consider upgrading via the '/usr/local/bin/python -m pip install --upgrade pip' command.
The command '/bin/sh -c pip install -r /src/requirements.txt' returned a non-zero code: 1

Startup script for the container doesn't recognize email flag

Hi,

I am passing both the -e [email protected] and -r <relay ip> flags when running the startup script, but I am met with the following error:

error: missing email argument
usage: ./docker_run.bash -c CONFIG -e EMAIL -r RELAY

Any ideas as to why it won't recognize the -e flag properly?

Here's the entire command I'm running:

./docker_run.bash -c ./config/*.yaml -e [email protected] -r 172.17.0.1
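One thing worth checking, independent of the script itself: `./config/*.yaml` is a glob, and if it matches more than one file the shell expands it into several arguments before `docker_run.bash` ever sees them, which can push the `-e` flag out of the position the argument parser expects. A minimal demonstration of the expansion (the temp directory is just for illustration):

```shell
demo_dir=$(mktemp -d)
touch "$demo_dir/a.yaml" "$demo_dir/b.yaml"
# 'set --' shows the positional argument list a script would receive
set -- -c "$demo_dir"/*.yaml -e user@example.com -r 172.17.0.1
echo "$#"   # 7 arguments instead of 6: the glob expanded into two config paths
rm -rf "$demo_dir"
```

Passing one explicit config file per container avoids the ambiguity.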

Not working with UK Amazon

When using this with UK Amazon it says:
unable to convert "£140.45" to float... caught exception: could not convert string to float: '£140.45'
However, it works with US Amazon.

Consider configs for processors

This is more of a feature request, but with the shortage of new Ryzen processors, I thought you might want to add some configs for those too.
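Adding one should mostly be a matter of copying an existing YAML config and swapping in CPU product pages. The actual schema is defined by the examples in the config/ directory; the fields and URL below are placeholders that mirror the general shape of those examples, so check a real config before using this:

```yaml
# config/newegg_ryzen_5900x.yaml (hypothetical -- mirror a real example config)
refresh_interval: 60          # seconds between checks, if the schema supports it
max_price: 599                # skip listings above MSRP
urls:
  - https://www.newegg.com/amd-ryzen-9-5900x/p/...
```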

Multiple email addresses

How does one input multiple email addresses when running the command shown in the README file (if that's possible)?

SMTP Relay Directory

Should I set up the SMTP relay in the base directory or within the inventory-hunter directory after installing Docker? Sorry, I'm new to this and still learning. Thanks!

Cannot build Docker image

Hey guys, I'm new to coding and don't know how to do a lot of these things, so please bear with me. By the way, thanks Eric for making this; it's much appreciated. Sorry, my English is bad too. When building the image it gets to step 9/11 and then says this:

pi@raspberrypi:~/inventory-hunter $ docker build -t inventory-hunter .
Sending build context to Docker daemon 125.4kB
Step 1/11 : FROM python:3.9
---> 1e73204649e2
Step 2/11 : RUN apt update
---> Using cache
---> 059ec8ea530b
Step 3/11 : RUN apt install -y chromium chromium-driver locales locales-all
---> Using cache
---> c26fc144649c
Step 4/11 : ENV LC_ALL en_US.UTF-8
---> Using cache
---> 836462816cda
Step 5/11 : ENV LANG en_US.UTF-8
---> Using cache
---> 3c4e49abe3c4
Step 6/11 : ENV LANGUAGE en_US.UTF-8
---> Using cache
---> 1a77b4929c62
Step 7/11 : WORKDIR /
---> Using cache
---> 85ce7862d489
Step 8/11 : COPY requirements.txt /src/requirements.txt
---> Using cache
---> 407463daf530
Step 9/11 : RUN pip install -r /src/requirements.txt
---> Running in e11994b8381a
Collecting beautifulsoup4
Downloading beautifulsoup4-4.9.3-py3-none-any.whl (115 kB)
Collecting lxml
Downloading lxml-4.6.2.tar.gz (3.2 MB)
Collecting pyyaml
Downloading PyYAML-5.3.1.tar.gz (269 kB)
Collecting requests
Downloading requests-2.25.0-py2.py3-none-any.whl (61 kB)
Collecting selenium
Downloading selenium-3.141.0-py2.py3-none-any.whl (904 kB)
Collecting soupsieve>1.2; python_version >= "3.0"
Downloading soupsieve-2.0.1-py3-none-any.whl (32 kB)
Collecting chardet<4,>=3.0.2
Downloading chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting urllib3<1.27,>=1.21.1
Downloading urllib3-1.26.2-py2.py3-none-any.whl (136 kB)
Collecting certifi>=2017.4.17
Downloading certifi-2020.11.8-py2.py3-none-any.whl (155 kB)
Collecting idna<3,>=2.5
Downloading idna-2.10-py2.py3-none-any.whl (58 kB)
Building wheels for collected packages: lxml, pyyaml
Building wheel for lxml (setup.py): started
Building wheel for lxml (setup.py): still running...
Building wheel for lxml (setup.py): finished with status 'error'
ERROR: Command errored out with exit status 1:
command: /usr/local/bin/python -u -c 'import sys, setuptools, tokenize; sys.argv[0] = '"'"'/tmp/pip-install-mh277zjg/lxml/setup.py'"'"'; __file__='"'"'/tmp/pip-install-mh277zjg/lxml/setup.py'"'"';f=getattr(tokenize, '"'"'open'"'"', open)(__file__);code=f.read().replace('"'"'\r\n'"'"', '"'"'\n'"'"');f.close();exec(compile(code, __file__, '"'"'exec'"'"'))' bdist_wheel -d /tmp/pip-wheel-tbihpvsi
cwd: /tmp/pip-install-mh277zjg/lxml/
Complete output (84 lines):
Building lxml version 4.6.2.
Building without Cython.
Building against libxml2 2.9.4 and libxslt 1.1.32
running bdist_wheel
running build
running build_py
creating build
creating build/lib.linux-armv7l-3.9
creating build/lib.linux-armv7l-3.9/lxml
copying src/lxml/ElementInclude.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/cssselect.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/usedoctest.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/__init__.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/builder.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/doctestcompare.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/_elementpath.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/pyclasslookup.py -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/sax.py -> build/lib.linux-armv7l-3.9/lxml
creating build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/__init__.py -> build/lib.linux-armv7l-3.9/lxml/includes
creating build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/usedoctest.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/__init__.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/formfill.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/soupparser.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/builder.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/_diffcommand.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/_html5builder.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/ElementSoup.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/clean.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/_setmixin.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/diff.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/html5parser.py -> build/lib.linux-armv7l-3.9/lxml/html
copying src/lxml/html/defs.py -> build/lib.linux-armv7l-3.9/lxml/html
creating build/lib.linux-armv7l-3.9/lxml/isoschematron
copying src/lxml/isoschematron/__init__.py -> build/lib.linux-armv7l-3.9/lxml/isoschematron
copying src/lxml/etree.h -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/etree_api.h -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/lxml.etree.h -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/lxml.etree_api.h -> build/lib.linux-armv7l-3.9/lxml
copying src/lxml/includes/relaxng.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/etreepublic.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/xmlerror.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/tree.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/xmlparser.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/xinclude.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/config.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/htmlparser.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/xslt.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/c14n.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/uri.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/schematron.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/dtdvalid.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/xmlschema.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/__init__.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/xpath.pxd -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/etree_defs.h -> build/lib.linux-armv7l-3.9/lxml/includes
copying src/lxml/includes/lxml-version.h -> build/lib.linux-armv7l-3.9/lxml/includes
creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources
creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/rng
copying src/lxml/isoschematron/resources/rng/iso-schematron.rng -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/rng
creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl
copying src/lxml/isoschematron/resources/xsl/XSD2Schtrn.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl
copying src/lxml/isoschematron/resources/xsl/RNG2Schtrn.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl
creating build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_dsdl_include.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_skeleton_for_xslt1.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_svrl_for_xslt1.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_abstract_expand.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/iso_schematron_message.xsl -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
copying src/lxml/isoschematron/resources/xsl/iso-schematron-xslt1/readme.txt -> build/lib.linux-armv7l-3.9/lxml/isoschematron/resources/xsl/iso-schematron-xslt1
running build_ext
building 'lxml.etree' extension
creating build/temp.linux-armv7l-3.9
creating build/temp.linux-armv7l-3.9/src
creating build/temp.linux-armv7l-3.9/src/lxml
gcc -pthread -Wno-unused-result -Wsign-compare -DNDEBUG -g -fwrapv -O3 -Wall -fPIC -DCYTHON_CLINE_IN_TRACEBACK=0 -I/usr/include/libxml2 -Isrc -Isrc/lxml/includes -I/usr/local/include/python3.9 -c src/lxml/etree.c -o build/temp.linux-armv7l-3.9/src/lxml/etree.o -w
gcc: fatal error: Killed signal terminated program as
compilation terminated.
Compile failed: command '/usr/bin/gcc' failed with exit code 1
creating tmp
cc -I/usr/include/libxml2 -I/usr/include/libxml2 -c /tmp/xmlXPathInit_he3bsco.c -o tmp/xmlXPathInit_he3bsco.o
cc tmp/xmlXPathInit_he3bsco.o -lxml2 -o a.out
error: command '/usr/bin/gcc' failed with exit code 1

ERROR: Failed building wheel for lxml
Running setup.py clean for lxml
Building wheel for pyyaml (setup.py): started
Building wheel for pyyaml (setup.py): still running...
Building wheel for pyyaml (setup.py): finished with status 'done'
Created wheel for pyyaml: filename=PyYAML-5.3.1-cp39-cp39-linux_armv7l.whl size=514839 sha256=c804dc7fbd8263026447af4b8be22e6e6d3c3db5831efcc487b996c5a42880ad
Stored in directory: /root/.cache/pip/wheels/69/60/81/5cd74b8ee068fbe9e04ca0d53148f28f5c6e2c5b177d5dd622
Successfully built pyyaml
Failed to build lxml
Installing collected packages: soupsieve, beautifulsoup4, lxml, pyyaml, chardet, urllib3, certifi, idna, requests, selenium
Running setup.py install for lxml: started

I did this 3 times, and when doing step 3 it says no Docker image file was found.

Docker image list:

docker image list
REPOSITORY    TAG       IMAGE ID       CREATED         SIZE
<none>        <none>    407463daf530   4 hours ago     1.3GB
python        3.9       1e73204649e2   11 days ago     694MB
hello-world   latest    851163c78e4a   11 months ago   4.85kB

Please help me figure out what is going on or what I'm missing.

By the way, Docker is installed correctly; I already checked.

Thanks, guys

[QUESTIONS] Config Refresh, Email Resending

Thanks for making this! A few questions:

  1. Once the config file has been loaded, does inventory-hunter need to be restarted if a new URL is added? So far my testing says it does.

  2. If I get an email about an in-stock product, what happens next? Will I get another email on the next check for the same product?

  3. If I get an email about an in-stock product, and then a while later another product comes in stock, will I get an email about the new one?

[Question] Is there a way to test?

Hello,
I am fairly new to this sort of code. I was wondering if I could somehow force this program to send me a notification, to make sure it is working correctly? Also, I am looking forward to Best Buy support.
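One low-effort way to verify the notification path end to end, independent of any scraper logic, is to post directly to the same Discord webhook URL you would pass to `docker_run.bash`; if a message arrives, the webhook side works and any remaining problem is in the scraping or config side. A sketch using only the standard library (the webhook URL at the bottom is a placeholder):

```python
import json
import urllib.request


def build_discord_payload(message: str) -> bytes:
    """Discord webhooks accept a JSON body with a 'content' field."""
    return json.dumps({"content": message}).encode("utf-8")


def send_test_alert(webhook_url: str) -> None:
    # Network call; raises urllib.error.HTTPError on a bad webhook URL.
    req = urllib.request.Request(
        webhook_url,
        data=build_discord_payload("inventory-hunter test notification"),
        headers={"Content-Type": "application/json"},
    )
    urllib.request.urlopen(req)


# send_test_alert("https://discord.com/api/webhooks/<id>/<token>")
```

The same idea works for Slack webhooks, whose payload also uses a JSON body (with a `text` field instead of `content`).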

NewEgg Ban

I ran the script with the NewEgg config and was eventually banned from visiting NewEgg's site for using an automated process. Not sure if anything can be done about this programmatically, but it may be worth mentioning in the README.


[QUESTION] Does it work with Amazon or Bestbuy?

Thanks for coding this! I'm looking to get an RTX 3070 for my first gaming PC build.
You listed three configs here, for Newegg, B&H, and Micro Center. Would it be possible to configure this to work with Best Buy or Amazon?
Thanks!

EDIT: I just saw that you are working on Best Buy support!
Thank you! (I think that the founders edition looks the coolest by far)

How to make it work on Spanish sites

Hello, how can I make it work on Spanish sites? Should I change "add to cart" to "agregar al carrito", or change the language setting?

I was able to make it work normally on Windows.
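Whether a locale change alone is enough depends on how the scraper decides a product is in stock; if it matches button text, the match phrases themselves would need localizing. A hypothetical sketch of that idea (the phrase table and function are illustrative, not inventory-hunter's actual detection code, which may key off element IDs instead):

```python
# Hypothetical per-locale phrases; the real detection logic lives in the
# project's scraper module and may work differently.
IN_STOCK_PHRASES = {
    "en": ("add to cart",),
    "es": ("agregar al carrito", "añadir a la cesta"),
}


def looks_in_stock(page_text: str, locale: str = "en") -> bool:
    """Case-insensitive substring match against locale-specific phrases."""
    text = page_text.lower()
    return any(phrase in text for phrase in IN_STOCK_PHRASES[locale])
```

If the scraper works this way, supporting a Spanish storefront would mean adding the localized phrases wherever the English ones are defined.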

Docker pull function not working

Installing on a laptop running Raspberry Pi OS (LOL), since my Raspberry Pi 1 Model A decided not to work.
I am unable to run docker pull to finish installing.

Input: docker pull ericjmarti/inventory-hunter:latest
Output: latest: Pulling from ericjmarti/inventory-hunter
no matching manifest for unknown in the manifest list entries

Any suggestions?
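A "no matching manifest" error usually means Docker could not find an image variant built for this machine's CPU architecture, which seems plausible here: a Pi 1 Model A is armv6, and an x86 laptop running the Raspberry Pi OS desktop build is neither armv6 nor armv7, so a multi-arch image may not cover either. A quick way to see what architecture the machine reports (the commented docker line assumes the daemon is running):

```shell
# Report this machine's CPU architecture; Docker pulls the manifest entry
# matching it (e.g. armv7l, aarch64, x86_64).
uname -m

# With the Docker daemon running, this shows the same from Docker's view:
# docker info --format '{{.Architecture}}'
```

If the reported architecture is not among the platforms the image is published for, the fallback is to build the image locally with `docker build` as described in the README.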
