googlecolab / jupyter_http_over_ws
License: Apache License 2.0
A few hours ago I was able to connect my Colab notebook to my local runtime using Jupyter. I am now suddenly unable to, despite reinstalling jupyter_http_over_ws and running with --NotebookApp.allow_origin, --no-browser, and a variety of other flags.
Hi, I am a bit disoriented. I am using h5py to read datasets from a hierarchical, multi-group HDF5 file.
First I implemented a NN in a Jupyter notebook on Google Colab; then I decided to move most of the code into custom libraries and only run a command with arguments from the notebook in Colab, but the run was much slower!
In both cases, the code reading the datasets from the HDF5 files lives in custom libraries (.py files called from Jupyter).
The only difference is that in one case I loop through each epoch and batch from Jupyter, while in the new (and much slower) implementation I call custom libraries to loop through epochs and batches; that's all.
In the latter case, the code is much slower. I carefully tracked the origin of the delay, and it seems to be in reading and managing datasets with h5py.
For instance, the following code:
def _get_signal_window(self, with_labels=False):
    if self.get_number_of_avail_windows() == 0:
        self._reset_random_wind()
    sample = self._get_sample()
    Cnp = sample[0]
    Duration = sample[1]
    Dnp = sample[2]
    window_number = sample[3]
    # >>>>>>>>>>>>> HERE IS THE DIFFERENCE
    dset = self.File['Cnp_' + str(Cnp+1) + '/Duration_' + str(Duration+1) + '/Dnp_' + str(Dnp+1) + '/data']
    assert dset.shape[1] % self.length == 0
    samples_per_second = int(dset.shape[1] / self.length)
    samples_per_window = int(samples_per_second * self.window)
    begin = window_number * samples_per_window
    end = begin + samples_per_window
    time_window = torch.Tensor(dset[0, begin:end]).to(self.device)
    clean_signal = torch.Tensor(dset[1, begin:end]).to(self.device)
    noisy_signal = torch.Tensor(dset[2, begin:end]).to(self.device)
    if with_labels:
        starts, widths, amplitudes, categories, number_of_pulses, average_width, average_amplitude = self._get_labels(time_window, Cnp, Duration, Dnp)
        # >>>>>>>>>>>>> HERE IS THE DIFFERENCE
        return time_window, clean_signal, noisy_signal, starts, widths, amplitudes, categories, number_of_pulses, average_width, average_amplitude
    else:
        return time_window, clean_signal, noisy_signal
runs at least 6 times faster when all the code is in Jupyter than when I use custom libraries (remember, the code above is in a custom library in both cases).
The code between the tags is the part that makes the difference, and it is identical in both cases.
I can share repo link and jupyter files too
What is happening here?
Thanks!!!
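Not an answer from the thread, but one plausible cause worth checking in cases like the one above: if the library version re-resolves the `self.File['Cnp_.../data']` group path on every window fetch while the notebook version ends up reusing handles, the repeated hierarchy traversal (plus many small reads) can dominate. A minimal, hypothetical caching sketch (`DatasetCache` is an illustrative name, not part of the original code):

```python
# Hypothetical sketch: memoize dataset handles so the HDF5 group path is
# only resolved once per path rather than once per window fetch.
class DatasetCache:
    """Memoizes path lookups on an h5py.File-like (mapping-style) object."""

    def __init__(self, h5_file):
        self._file = h5_file
        self._cache = {}

    def get(self, path):
        # Only traverse the group hierarchy the first time a path is seen.
        if path not in self._cache:
            self._cache[path] = self._file[path]
        return self._cache[path]


# Demo with a plain dict standing in for an open h5py.File:
cache = DatasetCache({"Cnp_1/Duration_1/Dnp_1/data": [0.0, 1.0, 2.0]})
dset = cache.get("Cnp_1/Duration_1/Dnp_1/data")
```

With a real file you would pass the open `h5py.File` instead of the dict; the per-window cost then reduces to the slice read itself.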
I apologize in advance for the vague issue, but I can't seem to change the Jupyter kernel used by my local runtime. I've tried setting the default kernel in the config and running Jupyter from within a conda virtual environment. If anyone is aware of a way to do this, it would be greatly appreciated.
Tornado released 6.0.0 on 2019-03-01, making jupyter_http_over_ws raise an ImportError when invoking:
jupyter serverextension enable --py jupyter_http_over_ws
https://github.com/tornadoweb/tornado/issues/2383
File "<site-packages\jupyter_http_over_ws\handlers.py", line 28, in <module>
from tornado import stack_context
ImportError: cannot import name 'stack_context'
Using tornado==5.1 makes it work.
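As noted above, pinning Tornado below 6.0 (where tornado.stack_context was removed) works around the ImportError; a sketch:

```shell
# Downgrade to the last 5.x release, where stack_context still exists,
# then re-enable the server extension.
pip install "tornado==5.1"
jupyter serverextension enable --py jupyter_http_over_ws
```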
How can I install this package in Anaconda and use it in Anaconda envs?
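A sketch of one way to do this, assuming a conda env named myenv (the name is illustrative): install the package with the env's own pip, so the server extension registers against that env's Jupyter.

```shell
# Install and enable inside the activated env, not the base install.
conda activate myenv
pip install jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
```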
If you connect to a local runtime via port forwarding and the port number is forwarded to a different port, the proxy will fail to connect and will generate the error:
[E 21:14:40.883 NotebookApp] Couldn't attach auth cookies
[E 21:14:59.596 NotebookApp] Uncaught error when proxying request
Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/tornado/tcpclient.py", line 138, in on_connect_done
    stream = future.result()
tornado.iostream.StreamClosedError: Stream is closed

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/opt/conda/lib/python3.8/site-packages/jupyter_http_over_ws/handlers.py", line 179, in _attach_auth_cookies
    extra_cookies = yield _perform_request_and_extract_cookies(
  File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 762, in run
    value = future.result()
  File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 769, in run
    yielded = self.gen.throw(*exc_info)  # type: ignore
  File "/opt/conda/lib/python3.8/site-packages/jupyter_http_over_ws/handlers.py", line 533, in _perform_request_and_extract_cookies
    response = yield http_client.fetch(proxy_request, raise_error=False)
  File "/opt/conda/lib/python3.8/site-packages/tornado/gen.py", line 762, in run
    value = future.result()
  File "/opt/conda/lib/python3.8/site-packages/tornado/iostream.py", line 1205, in connect
    self.socket.connect(address)
OSError: [Errno 99] Cannot assign requested address
The connection works if I ensure that the port is not changed during the forwarding.
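A sketch of a forwarding command that keeps the port number identical on both ends (host and user names are placeholders):

```shell
# Local port 8888 -> remote port 8888; per the report above, changing
# either number breaks the proxy's auth handshake.
ssh -L 8888:localhost:8888 user@remote-host
```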
My issue may be similar to @debsahu on #1.
I followed instructions as described here.
I am attempting to get Colaboratory working from a browser on my laptop using a different computer as the local runtime - let's call this computer "local_server", which happens to have an IP address of 10.1.1.3 on my local LAN. The browser I am using is Chrome (Version 67.0.3396.99 (Official Build) (64-bit)).
I now want to work on Colaboratory on my laptop using local_server as the local runtime, so I set up an SSH tunnel:
ssh -L 8889:10.1.1.3:8888 10.1.1.3
In the jupyter_notebook_config.py file on the local_server machine, I set c.NotebookApp.ip = '10.1.1.3'.
Now I can go to http://localhost:8889/tree on my laptop and Jupyter works, but Colaboratory does not. I get the following error from Jupyter:
[E 10:22:43.257 NotebookApp] Uncaught error when proxying request
Traceback (most recent call last):
  File "/usr/local/lib/python3.4/dist-packages/jupyter_http_over_ws/handlers.py", line 151, in on_message
    response.rethrow()
  File "/usr/local/lib/python3.4/dist-packages/tornado/httpclient.py", line 652, in rethrow
    raise self.error
ConnectionRefusedError: [Errno 111] Connection refused
What am I doing wrong? Any suggestions would be appreciated. Thanks.
Hi!
I have a problem: my Colab notebook keeps crashing when connected to the local runtime (over two SSH tunnels, if that makes a difference) with the error 404 GET /api/contents/var/colab/app.log (127.0.0.1): No such file or directory: var/colab/app.log. I pasted the cmd output below. Sometimes it happens immediately when I try to run the first cell after successfully connecting to the local runtime, and sometimes after some time mid-training (using TF v2). My OS is Ubuntu 18.04.
Do you know this problem? Any ideas on how to fix it?
[I 09:13:30.945 NotebookApp] Jupyter Notebook 6.4.6 is running at:
[I 09:13:30.945 NotebookApp] http://localhost:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:30.945 NotebookApp] or http://127.0.0.1:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:30.945 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[C 09:13:30.948 NotebookApp]
To access the notebook, open this file in a browser:
file:///home/pjanuszewski/.local/share/jupyter/runtime/nbserver-8792-open.html
Or copy and paste one of these URLs:
http://localhost:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
or http://127.0.0.1:8888/?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:50.656 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.910000ms
[I 09:13:50.865 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.880000ms
[I 09:13:50.905 NotebookApp] Kernel started: f4a78d66-276f-4b34-a18a-88722e9d992e, name: python3
[I 09:13:50.952 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.930000ms
[I 09:13:50.963 NotebookApp] proxying WebSocket connection to: ws://localhost:8888/api/kernels/f4a78d66-276f-4b34-a18a-88722e9d992e/channels?session_id=cad2755a9fd5414ec4dd128190228efc&jupyter_http_over_ws_auth_url=http%3A%2F%2Flocalhost%3A8888%2F%3Ftoken%3De848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa
[I 09:13:54.878 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.900000ms
[I 09:13:56.594 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.840000ms
[I 09:13:57.777 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.810000ms
[I 09:14:11.905 NotebookApp] KernelRestarter: restarting kernel (1/5), keep random ports
WARNING:root:kernel f4a78d66-276f-4b34-a18a-88722e9d992e restarted
[I 09:14:12.374 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 1.430000ms
[I 09:14:12.394 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.490000ms
[W 09:14:12.401 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1): No such file or directory: var/colab/app.log
[W 09:14:12.401 NotebookApp] No such file or directory: var/colab/app.log
[W 09:14:12.401 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1) 0.840000ms referer=None
[I 09:14:12.406 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.350000ms
[W 09:14:12.412 NotebookApp] 404 GET /api/contents/var/colab/ooms (127.0.0.1): No such file or directory: var/colab/ooms
[W 09:14:12.412 NotebookApp] No such file or directory: var/colab/ooms
[W 09:14:12.412 NotebookApp] 404 GET /api/contents/var/colab/ooms (127.0.0.1) 0.590000ms referer=None
[I 09:14:15.560 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.800000ms
[W 09:14:15.573 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1): No such file or directory: var/colab/app.log
[W 09:14:15.574 NotebookApp] No such file or directory: var/colab/app.log
[W 09:14:15.574 NotebookApp] 404 GET /api/contents/var/colab/app.log (127.0.0.1) 1.180000ms referer=None
[I 09:15:18.702 NotebookApp] 302 GET /?token=e848e3b8ff73d61e99bbc4ea2e575ff4157442406978cbfa (127.0.0.1) 0.800000ms
[I 09:15:23.921 NotebookApp] KernelRestarter: restarting kernel (1/5), keep random ports
Thanks!
Piotr
I installed this extension and now my jupyter won't run anything as it can't connect to the kernel. Please advise on how I can troubleshoot/uninstall this extension. Thanks!
Followed the instructions as described here. Jupyter works on localhost without issues. I opened the firewall to allow TCP over port 9090 and issued this:
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=9090
and tried to reconnect in Colab; it fails, and this is what is seen in the Jupyter terminal.
[C 16:36:43.917 NotebookApp]
Copy/paste this URL into your browser when you connect for the first time,
to login with a token:
http://localhost:9090/?token=xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx
[I 16:36:44.072 NotebookApp] Accepting one-time-token-authenticated connection from ::1
[W 16:38:04.056 NotebookApp] Forbidden
[W 16:38:12.181 NotebookApp] 403 GET /api/kernelspecs (::1) 8124.95ms referer=None
Any help appreciated!
I can access the Jupyter server on my local machine, but when I try to connect to a local runtime on Google Colab, I get this error:
404 GET /http_over_websocket?min_version=0.0.7&jupyter_http_over_ws_auth_url=http%3A%2F%2Flocalhost%3A8098%2F%3Ftoken%{TOKEN_NOT_INCLUDED} (172.17.0.1) 15.77ms referer=None
I am trying to run basic code on Colab, but I am not able to connect to any runtime. It just tries to allocate one but never does. It's not just in Chrome; it's the same in all the browsers on my system. I don't know what to do about this issue. I am also not able to load Stadia, and I cannot use Google Docs, as it asks me to reload.
Can anyone help me with this problem?
I cannot connect Colab to my local runtime using the instructions provided by Google. However, when I turn off my firewall, it works perfectly fine. Can you tell me which protocol or port I should allow through my firewall to make the connection? I don't want to turn off my firewall or drop the Jupyter token, due to the security threats doing so poses.
Please excuse me, and direct me to another source of information if there is one (I couldn't find anything on this).
I am trying to connect to jupyter notebook on google colab's local runtime and when I try to I get this message:
[E 13:51:13.307 NotebookApp] Rejecting connection: Requested version (0.0.1a2) > Current version (0.0.1a1). Please upgrade this package.
I was running local runtime jobs perfectly fine this morning, but a couple of hours later it suddenly stopped working. I don't believe I even touched the machine in that timeframe.
This was the entire message:
(base) C:\Users\ballcap>jupyter notebook \ --NotebookApp.allow_origin='https://colab.research.google.com' \ --port=8889
jupyter_http_over_ws extension initialized. Listening on /http_over_websocket
[I 13:50:46.680 NotebookApp] JupyterLab alpha preview extension loaded from C:\Users\2hinc\AppData\Local\Continuum\anaconda3\lib\site-packages\jupyterlab
JupyterLab v0.27.0
Known labextensions:
[I 13:50:46.727 NotebookApp] Running the core application with no additional extensions or settings
[I 13:50:46.758 NotebookApp] Serving notebooks from local directory: C:\
[I 13:50:46.773 NotebookApp] The Jupyter Notebook is running at:
[I 13:50:46.789 NotebookApp] http://localhost:8889/?token=0af42d8c7656bc5a04fea50ea6623ae8bb099209d086e31b
[I 13:50:46.805 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[C 13:50:46.836 NotebookApp]
    Copy/paste this URL into your browser when you connect for the first time,
    to login with a token:
        http://localhost:8889/?token=0af42d8c7656bc5a04fea50ea6623ae8bb099209d086e31b
[I 13:50:47.007 NotebookApp] Accepting one-time-token-authenticated connection from ::1
[E 13:51:13.307 NotebookApp] Rejecting connection: Requested version (0.0.1a2) > Current version (0.0.1a1). Please upgrade this package.
How do I rectify this in Google Colab?
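Given the version-mismatch message ("Please upgrade this package"), the likely fix is upgrading the extension on the local machine, re-enabling it, and restarting the notebook server:

```shell
# Upgrade the extension that serves the Colab connection, then re-enable it.
pip install --upgrade jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
```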
The notebook server will temporarily stop sending output to the client in order to avoid crashing it.
To change this limit, set the config variable
--NotebookApp.iopub_data_rate_limit.
Current values:
NotebookApp.iopub_data_rate_limit=1000000.0 (bytes/sec)
NotebookApp.rate_limit_window=3.0 (secs)
I'm trying to connect to a local runtime, but I'm getting the following error: Error loading server extension —.
jupyter_http_over_ws extension initialized. Listening on /http_over_websocket
[I 00:48:59.298 NotebookApp] JupyterLab extension loaded from C:\Users\Issstezac1\anaconda3\lib\site-packages\jupyterlab
[I 00:48:59.298 NotebookApp] JupyterLab application directory is C:\Users\Issstezac1\anaconda3\share\jupyter\lab
[W 00:48:59.314 NotebookApp] Error loading server extension —
Traceback (most recent call last):
  File "C:\Users\Issstezac1\anaconda3\lib\site-packages\notebook\notebookapp.py", line 1670, in init_server_extensions
    mod = importlib.import_module(modulename)
  File "C:\Users\Issstezac1\anaconda3\lib\importlib\__init__.py", line 127, in import_module
    return _bootstrap._gcd_import(name[level:], package, level)
  File "<frozen importlib._bootstrap>", line 1006, in _gcd_import
  File "<frozen importlib._bootstrap>", line 983, in _find_and_load
  File "<frozen importlib._bootstrap>", line 965, in _find_and_load_unlocked
ModuleNotFoundError: No module named '—'
[I 00:48:59.314 NotebookApp] Serving notebooks from local directory: C:\
[I 00:48:59.314 NotebookApp] The Jupyter Notebook is running at:
[I 00:48:59.330 NotebookApp] http://localhost:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
[I 00:48:59.330 NotebookApp] or http://127.0.0.1:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
[I 00:48:59.330 NotebookApp] Use Control-C to stop this server and shut down all kernels (twice to skip confirmation).
[C 00:48:59.392 NotebookApp]
    To access the notebook, open this file in a browser:
        file:///C:/Users/Issstezac1/AppData/Roaming/jupyter/runtime/nbserver-2716-open.html
    Or copy and paste one of these URLs:
        http://localhost:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
        or http://127.0.0.1:8888/?token=325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745
[W 00:50:09.451 NotebookApp] Blocking Cross Origin API request for /http_over_websocket. Origin: https://colab.research.google.com, Host: localhost:8888
[W 00:50:09.451 NotebookApp] 403 GET /http_over_websocket?min_version=0.0.7&jupyter_http_over_ws_auth_url=http%3A%2F%2Flocalhost%3A8888%2F%3Ftoken%3D325c5ffe303c1c7b54b2639d3c6a256550411126dbfb9745 (::1) 0.00ms referer=None
[W 00:50:12.951 NotebookApp] Blocking Cross Origin API request for /http_over_websocket/diagnose. Origin: https://colab.research.google.com, Host: localhost:8888
[W 00:50:12.951 NotebookApp] 403 GET /http_over_websocket/diagnose?min_version=0.0.7 (::1) 0.00ms referer=None
I followed every step of the documentation, but I still cannot connect locally. I'm currently using Mozilla Firefox 77.0.1 (64-bit), so I enabled network.websocket.allowInsecureFromHTTPS as mentioned in the browser-specific settings. I even tried the --no-browser flag, as the Jupyter HTTP-over-WebSocket troubleshooting suggests. What am I doing wrong?
Any further help will be appreciated.
I have installed both jupyter and its corresponding extension jupyter_http_over_ws. But when I try to run Colab with the local resources I get an error like this:
I have also checked that if I avoid authentication with this command:
jupyter notebook --no-browser --allow-root --NotebookApp.allow_origin='https://colab.research.google.com' --NotebookApp.token='' --NotebookApp.disable_check_xsrf=True
Everything works fine, but I don't feel comfortable working like this.
Hope someone can help with this problem.
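For comparison, an invocation that keeps token auth and only relaxes the origin check avoids the fully-open flags quoted above; a sketch (port number is illustrative):

```shell
# Allow the Colab origin but keep the token and XSRF checks enabled.
jupyter notebook \
  --NotebookApp.allow_origin='https://colab.research.google.com' \
  --port=8888 \
  --NotebookApp.port_retries=0
```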
Many other Jupyter notebook plugins (e.g. jupyterlab or jupyterlab-git) automatically enable themselves via the mechanism described here: https://jupyter-notebook.readthedocs.io/en/stable/examples/Notebook/Distributing%20Jupyter%20Extensions%20as%20Python%20Packages.html#Automatically-enabling-a-server-extension-and-nbextension
jupyterlab, for example, does this here: https://github.com/jupyterlab/jupyterlab/blob/master/setup.py#L46
It would be great if jupyter_http_over_ws could do the same, which would remove the need to explicitly run jupyter serverextension enable --py jupyter_http_over_ws after installing.
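For context, the mechanism documented at the first link above ships a small JSON enablement file via data_files. A hedged sketch of what that could look like for this package (paths and filenames are illustrative, not taken from the actual setup.py):

```python
# Sketch only: auto-enable a Jupyter server extension at install time by
# placing a config snippet into jupyter_notebook_config.d.
from setuptools import setup

setup(
    name="jupyter_http_over_ws",
    # ... other metadata elided ...
    data_files=[
        # jupyter-config/jupyter_http_over_ws.json would contain:
        #   {"NotebookApp": {"nbserver_extensions": {"jupyter_http_over_ws": true}}}
        (
            "etc/jupyter/jupyter_notebook_config.d",
            ["jupyter-config/jupyter_http_over_ws.json"],
        ),
    ],
)
```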
It's all in the title: the last update, to 0.0.7, makes it impossible to connect Colab to a Jupyter instance through a proxy.
[Colab] --> [Local proxy ws://] --> -LAN- --> [GPU server running a Jupyter instance]
The following trace is observable on the Jupyter instance:
[E 19:36:25.303 NotebookApp] Uncaught error when proxying request
Traceback (most recent call last):
  File "/usr/local/lib/python2.7/dist-packages/jupyter_http_over_ws/handlers.py", line 178, in _attach_auth_cookies
    _validate_same_domain(self.request, parsed_auth_url)
  File "/usr/local/lib/python2.7/dist-packages/jupyter_http_over_ws/handlers.py", line 503, in _validate_same_domain
    handler_domain.geturl(), url.geturl()))
ValueError: Invalid cross-domain request from http://192.168.1.119:8888 to http://localhost:8899/
[E 19:36:25.304 NotebookApp] Couldn't attach auth cookies
Port 8899 is on the localhost computer where Colab is launched (my workstation). A proxy there redirects the WebSocket request to an internal LAN server serving Jupyter: 192.168.1.119 on port 8888.
Jupyter is launched with the following options:
jupyter notebook --NotebookApp.allow_origin='' --port=8888 --ip='0.0.0.0' --NotebookApp.port_retries=0 --NotebookApp.token='' --NotebookApp.password='' --NotebookApp.disable_check_xsrf=True --allow-root --no-browser
This was working on 0.0.6. Any advice?
** CAN'T CONNECT to Colab local runtime
System:
OS: Ubuntu 18.04
Install script
apt -y install python3 python3-pip
ln -s $(which python3) /usr/local/bin/python
pip --no-cache-dir install --upgrade pip
pip3 --no-cache-dir install --upgrade jupyter
pip3 --no-cache-dir install --upgrade jupyter_http_over_ws
jupyter serverextension enable --py jupyter_http_over_ws
# Load Jupyter Server
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8081 \
--allow-root --NotebookApp.port_retries=0 --NotebookApp.token='' --NotebookApp.password='' \
--NotebookApp.disable_check_xsrf=True --ip="*" --no-browser
I checked #12, but it still can't connect.
[I 18:18:13.963 NotebookApp] KernelRestarter: restarting kernel (1/5), keep random ports
[E 18:18:24.998 NotebookApp] Uncaught error when proxying request
Traceback (most recent call last):
  File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 446, in _get_proxied_ws
    client = yield self._proxied_ws_future
  File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
    value = future.result()
  File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 446, in _get_proxied_ws
    client = yield self._proxied_ws_future
  File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
    value = future.result()
  File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 446, in _get_proxied_ws
    client = yield self._proxied_ws_future
  File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
    value = future.result()
  File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 742, in run
    yielded = self.gen.throw(*exc_info)  # type: ignore
  File "/opt/ve/lib/python3.6/site-packages/jupyter_http_over_ws/handlers.py", line 437, in _on_open
    proxy_request, on_message_callback=self._on_proxied_message)
  File "/opt/ve/lib/python3.6/site-packages/tornado/gen.py", line 735, in run
    value = future.result()
tornado.simple_httpclient.HTTPTimeoutError: Timeout during request
Hi,
I am attempting to connect to a VM instance I started from GCP so that I can take advantage of a GPU in Google Colab. I am following the instructions here. I also attempted solutions from the issues raised here and here but they didn't work--I think my problem is somewhat different. I am using the following commands
First in Google Cloud SDK:
gcloud compute ssh --zone "us-central1-a" [instance_name] --project [project_name] -- -L 8889:localhost:8889
Then, in the instance that starts up:
jupyter serverextension enable --py jupyter_http_over_ws
jupyter notebook --NotebookApp.allow_origin="https://colab.research.google.com" --port=8889 --NotebookApp.port_retries=0 --no-browser
After those, my instance generates the url as expected, but when I paste it into my browser it says 'The site can't be reached.' (Connection reset error).
Any help or suggestions would be greatly appreciated!
When connecting Colab to a Jupyter server, the frontend application sends URL-encoded parameters to /api/sessions, which causes the API to return hard-to-read names. While this comes from the frontend, even if the frontend needs the URL encoding, it could be undone in this package when forwarding the API request to the Jupyter server.
This can be seen by creating a notebook in Drive with a character that needs to be escaped and looking at the WebSocket connections made to the handler this package provides.
An example response from /api/sessions after loading such a notebook might be the following:
[
  {
    "id": "fa293a7e-afb2-4fca-92c2-ffb9e548e052",
    "kernel": {
      "connections": 0,
      "execution_state": "idle",
      "id": "0ece430d-e288-4907-973f-d58a6d8010ff",
      "last_activity": "2021-02-17T06:31:48.663580Z",
      "name": "Python"
    },
    "name": "%5Btest%5D%20example%20notebook.ipynb",
    "notebook": {
      "name": "%5Btest%5D%20example%20notebook.ipynb",
      "path": "fileId=?????????????????????????????????"
    },
    "path": "fileId=?????????????????????????????????",
    "type": "notebook"
  }
]
The expected response would instead include the string [test] example notebook.ipynb.
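For reference, decoding such a name is a one-liner with the standard library; a sketch using the session name from the example response above:

```python
from urllib.parse import unquote

# %5B -> '[', %5D -> ']', %20 -> ' '
encoded = "%5Btest%5D%20example%20notebook.ipynb"
decoded = unquote(encoded)
print(decoded)  # [test] example notebook.ipynb
```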
Hi, I'm having trouble connecting to local runtime. This is the error I am getting:
`(base) C:\Users\harol>jupyter notebook \ --NotebookApp.allow_origin='https://colab.research.google.com' \ --port=8888 \ --NotebookApp.port_retries=0 --no-browser`
jupyter_http_over_ws extension initialized. Listening on /http_over_websocket
[I 01:29:45.989 NotebookApp] JupyterLab extension loaded from C:\Users\harol\anaconda3\lib\site-packages\jupyterlab
[I 01:29:45.989 NotebookApp] JupyterLab application directory is C:\Users\harol\anaconda3\share\jupyter\lab
[I 01:29:46.005 NotebookApp] Serving notebooks from local directory: C:\
[I 01:29:46.006 NotebookApp] The Jupyter
I've followed the setup instructions and browsed all of the issues to see whether anyone has had similar problems.
Any suggestions are greatly appreciated. Thanks!
enum34 isn't needed on Python 3.6 or later, and is incompatible with it (it doesn't include enum.IntFlag). The dependency is guarded only by an if statement in setup.py, which is not evaluated on the installing interpreter, so enum34 is always installed. Code that uses enum.IntFlag can then fail to install, or possibly even to run. Solving this:
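The standard fix is a PEP 508 environment marker in install_requires, which the installing interpreter (not the build machine) evaluates. A sketch (the package name and surrounding metadata are illustrative):

```python
# Sketch: declare enum34 with an environment marker instead of a
# setup-time `if`, so it is only installed where the stdlib lacks it.
from setuptools import setup

setup(
    name="example-package",  # illustrative name
    install_requires=[
        # Evaluated at install time on the target interpreter.
        'enum34; python_version < "3.4"',
    ],
)
```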
I'm following the instructions described here.
The very second step, enabling the jupyter_http_over_ws Jupyter extension, gives the following error:
Traceback (most recent call last):
  File "/usr/local/bin/jupyter-serverextension", line 8, in <module>
    sys.exit(main())
  File "/usr/local/lib/python2.7/dist-packages/jupyter_core/application.py", line 267, in launch_instance
    return super(JupyterApp, cls).launch_instance(argv=argv, **kwargs)
  File "/usr/local/lib/python2.7/dist-packages/traitlets/config/application.py", line 658, in launch_instance
    app.start()
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 293, in start
    super(ServerExtensionApp, self).start()
  File "/usr/local/lib/python2.7/dist-packages/jupyter_core/application.py", line 256, in start
    self.subapp.start()
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 210, in start
    self.toggle_server_extension_python(arg)
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 199, in toggle_server_extension_python
    m, server_exts = _get_server_extension_metadata(package)
  File "/usr/local/lib/python2.7/dist-packages/notebook/serverextensions.py", line 327, in _get_server_extension_metadata
    m = import_item(module)
  File "/usr/local/lib/python2.7/dist-packages/traitlets/utils/importstring.py", line 42, in import_item
    return __import__(parts[0])
ImportError: No module named jupyter_http_over_ws
I tried installing tornado==5.1, as mentioned in #5.
Also tried upgrading the package:
pip install --upgrade jupyter_http_over_ws
I am trying to connect it with Google Colab using this command:
jupyter notebook --NotebookApp.allow_origin='https://colab.research.google.com' --port=8081 --NotebookApp.port_retries=0
After this, the local machine is running on the same port, but it does not connect to Colab. I tried restarting my notebook, generating new tokens, and disabling all firewalls. Then I tried ngrok for tunneling the token URL, but I got stuck on where to use the tunneling URL. Then I tried changing the jupyter_notebook_config.py file by adding c.NotebookApp.allow_remote_access = True, as Bing suggested, but it still did not connect.
Can anyone please help me with this, or suggest another way to expose my localhost on the Internet for changing and debugging an .ipynb file located on the local server?
I recently switched from my local machine to running my Python TensorFlow code on Colab, and used the same code to run the models (all I changed in the entire code was one directory path, i.e. os.getcwd()), so I wanted to check whether Colab saves TensorBoard files a bit differently before I chase my tail.
It seems that tf.summary.FileWriter creates a folder for the event files, but the event files themselves don't appear in my Google Drive.
This is how I save it (rando is a random number, so I am definitely able to find that distinct folder):
train_writer = tf.summary.FileWriter('{}/train_{}'.format(fileprefix,rando), graph=graph)
I just spent some time connecting Colab to my local runtime, which is packed with dependencies built with Bazel... I just think it's cool because it matches how things work at Google. I wonder whether it would be beneficial to showcase examples like this to users.
Hello Colab team!
I'm having a few issues since the Colab update when trying to connect a Compute Engine VM as a backend.
pip install --upgrade jupyter_http_over_ws>=0.0.7 can only be run as sudo. This makes it so that jupyter serverextension enable --py jupyter_http_over_ws and the Jupyter kernel init (using --allow-root) have to be run as sudo too, because if you don't, they use the regular user's config (so version 0.0.6, which doesn't work with Colab...) while 0.0.7 is only installed for root.
Colab is then able to connect to the VM, but this seems to lead to some weird issues where, for example, f-strings are considered syntax errors despite my Python version being 3.5.3, and pubsub_v1 is not found after pip install --upgrade google-cloud-pubsub, etc. The worst part is, I tried with another, simpler, CPU-only VM I just created, and it works perfectly. I don't understand this seemingly random behavior.
Am I missing something obvious, or is it a known issue with Compute Engine VMs?
Thank you!
tl;dr: some Compute Engine VMs + jupyter_http_over_ws + Colab = weird errors
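One note on the f-string symptom, independent of the VM weirdness: f-string literals were only added in Python 3.6 (PEP 498), so a 3.5.3 interpreter raising SyntaxError on them is expected behavior, not a broken install. A quick sanity check (supports_fstrings is a hypothetical helper, not a real API):

```python
import sys

def supports_fstrings(version=sys.version_info[:2]):
    # f-strings landed in CPython 3.6 (PEP 498).
    return tuple(version) >= (3, 6)

print(supports_fstrings((3, 5)))  # False: SyntaxError on f"..." is expected
print(supports_fstrings((3, 6)))  # True
```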
I reran everything without panicking and it worked just fine... I think it was just caused by trying to skip steps, plus some installs I did in the notebook that broke some dependencies. Point taken.