
News

Posted about 4 years ago by JackM15
I'm currently attempting to get Flask-Mail working on Heroku but keep running into various issues. The general code is as follows:

    from threading import Thread
    from flask import render_template, current_app
    from flask_mail import Message
    ...
    from app import mail

    def send_async_email(app, msg):
        with app.app_context():
            mail.send(msg)
            print("message sent")

    def send_email(subject, sender, recipients, text_body, html_body):
        msg = Message(subject, sender=sender, recipients=recipients)
        msg.body = text_body
        msg.html = html_body
        Thread(target=send_async_email, args=(current_app._get_current_object(), msg)).start()

This is pretty much the code from https://blog.miguelgrinberg.com/post/the-flask-mega-tutorial-part-x-email-support.

The issue appears when I run it on Heroku. On my machine/dev environment the code runs through fine and reaches the "message sent" print statement. I don't get an email in my inbox, but that's fine for now; it executes without error, and I can tackle the missing email after I've managed to get it executing the same way on Heroku as it does on my development machine. When I run the same code on Heroku I get the following in my logs:

    2021-04-06T10:14:41.736652+00:00 app[web.1]: Exception in thread Thread-2:
    2021-04-06T10:14:41.736661+00:00 app[web.1]: Traceback (most recent call last):
    2021-04-06T10:14:41.736662+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 425, in resolve
    2021-04-06T10:14:41.736662+00:00 app[web.1]:     use_network=use_network)
    2021-04-06T10:14:41.736663+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 380, in query
    2021-04-06T10:14:41.736663+00:00 app[web.1]:     return end()
    2021-04-06T10:14:41.736664+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 359, in end
    2021-04-06T10:14:41.736664+00:00 app[web.1]:     raise result[1]
    2021-04-06T10:14:41.736664+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 340, in step
    2021-04-06T10:14:41.736665+00:00 app[web.1]:     a = fun(*args, **kwargs)
    2021-04-06T10:14:41.736665+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/dns/resolver.py", line 1091, in query
    2021-04-06T10:14:41.736666+00:00 app[web.1]:     True)
    2021-04-06T10:14:41.736667+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/dns/resolver.py", line 1043, in resolve
    2021-04-06T10:14:41.736667+00:00 app[web.1]:     timeout = self._compute_timeout(start, lifetime)
    2021-04-06T10:14:41.736668+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/dns/resolver.py", line 950, in _compute_timeout
    2021-04-06T10:14:41.736668+00:00 app[web.1]:     raise Timeout(timeout=duration)
    **2021-04-06T10:14:41.736668+00:00 app[web.1]: dns.exception.Timeout: The DNS operation timed out after 5.107004642486572 seconds**
    2021-04-06T10:14:41.736669+00:00 app[web.1]:
    2021-04-06T10:14:41.736669+00:00 app[web.1]: During handling of the above exception, another exception occurred:
    2021-04-06T10:14:41.736670+00:00 app[web.1]:
    2021-04-06T10:14:41.736670+00:00 app[web.1]: Traceback (most recent call last):
    2021-04-06T10:14:41.736671+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/threading.py", line 916, in _bootstrap_inner
    2021-04-06T10:14:41.736671+00:00 app[web.1]:     self.run()
    2021-04-06T10:14:41.736672+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/threading.py", line 864, in run
    2021-04-06T10:14:41.736672+00:00 app[web.1]:     self._target(*self._args, **self._kwargs)
    2021-04-06T10:14:41.736672+00:00 app[web.1]:   File "/app/app/email.py", line 9, in send_async_email
    2021-04-06T10:14:41.736673+00:00 app[web.1]:     mail.send(msg)
    2021-04-06T10:14:41.736673+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/flask_mail.py", line 491, in send
    2021-04-06T10:14:41.736674+00:00 app[web.1]:     with self.connect() as connection:
    2021-04-06T10:14:41.736674+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/flask_mail.py", line 144, in __enter__
    2021-04-06T10:14:41.736675+00:00 app[web.1]:     self.host = self.configure_host()
    2021-04-06T10:14:41.736675+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/flask_mail.py", line 158, in configure_host
    2021-04-06T10:14:41.736675+00:00 app[web.1]:     host = smtplib.SMTP(self.mail.server, self.mail.port)
    2021-04-06T10:14:41.736676+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/smtplib.py", line 251, in __init__
    2021-04-06T10:14:41.736676+00:00 app[web.1]:     (code, msg) = self.connect(host, port)
    2021-04-06T10:14:41.736677+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/smtplib.py", line 336, in connect
    2021-04-06T10:14:41.736677+00:00 app[web.1]:     self.sock = self._get_socket(host, port, self.timeout)
    2021-04-06T10:14:41.736678+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/smtplib.py", line 307, in _get_socket
    2021-04-06T10:14:41.736678+00:00 app[web.1]:     self.source_address)
    2021-04-06T10:14:41.736679+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/green/socket.py", line 44, in create_connection
    2021-04-06T10:14:41.736679+00:00 app[web.1]:     for res in getaddrinfo(host, port, 0, SOCK_STREAM):
    2021-04-06T10:14:41.736680+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 517, in getaddrinfo
    2021-04-06T10:14:41.736680+00:00 app[web.1]:     qname, addrs = _getaddrinfo_lookup(host, family, flags)
    2021-04-06T10:14:41.736681+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 490, in _getaddrinfo_lookup
    2021-04-06T10:14:41.736682+00:00 app[web.1]:     raise err
    2021-04-06T10:14:41.736682+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 479, in _getaddrinfo_lookup
    2021-04-06T10:14:41.736682+00:00 app[web.1]:     answer = resolve(host, qfamily, False, use_network=use_network)
    2021-04-06T10:14:41.736683+00:00 app[web.1]:   File "/app/.heroku/python/lib/python3.6/site-packages/eventlet/support/greendns.py", line 432, in resolve
    2021-04-06T10:14:41.736683+00:00 app[web.1]:     raise EAI_EAGAIN_ERROR
    [the same greendns/smtplib/threading frames repeat several more times in the log]
    **2021-04-06T10:14:41.736737+00:00 app[web.1]: socket.gaierror: [Errno -3] Lookup timed out**

I've tried various solutions, changing ports and changing email providers (Gmail and the email account supplied by my web host), but can't seem to find a solution. Any ideas?
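The traceback shows eventlet's greendns resolver (pulled in because the dyno runs an eventlet worker) timing out while looking up the SMTP host, before smtplib ever connects. One workaround that may be worth trying is disabling greendns so DNS goes through the standard resolver; a minimal sketch, assuming the app is served by gunicorn with the eventlet worker class and the mail settings themselves are correct (the "app:app" module path is hypothetical):

    # Hedged sketch: disable eventlet's green DNS resolver on the Heroku dyno.
    # EVENTLET_NO_GREENDNS must be set before "import eventlet" runs anywhere, so the
    # simplest place is a Heroku config var or the Procfile command line, e.g.:
    #
    #   heroku config:set EVENTLET_NO_GREENDNS=yes
    #
    #   # Procfile (hypothetical module path app:app)
    #   web: gunicorn --worker-class eventlet -w 1 app:app
    #
    # Setting it from Python only helps if it happens before anything imports eventlet:
    import os
    os.environ.setdefault('EVENTLET_NO_GREENDNS', 'yes')  # must run before eventlet is imported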
Posted about 4 years ago by Skel
I'm having trouble running Flask and Flask-SocketIO with eventlet despite using socketio.run(); any suggestions are appreciated. I'm currently on Python 3.9 and I've tried multiple different versions of each of these modules, to no avail.

    [2021-04-04 06:39:05,709] WARNING in __init__: Flask-SocketIO is Running under Werkzeug, WebSocket is not available.
    "GET /socket.io/?EIO=4&transport=websocket HTTP/1.1" 400 -

index.html: SAR ping

app.py:

    from flask import Flask, render_template
    from flask_socketio import SocketIO, emit
    import eventlet

    app = Flask(__name__)
    socketio = SocketIO(app, logger=True)

    @app.route('/')
    def index():
        return render_template('index.html')

    def receivedCallback():
        print('Pong received by user!')

    @socketio.on('ping')
    def handle_ping(data):
        print("received", data)
        socketio.emit('pong', "pong_data", callback=receivedCallback)

    if __name__ == '__main__':
        socketio.run(app)
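The warning suggests the Werkzeug development server is handling the traffic, i.e. Flask-SocketIO never picked up the eventlet server, which is why the WebSocket upgrade fails with 400. A minimal sketch of how one might force the eventlet backend so any import problem becomes visible (assuming eventlet is installed in the same environment; the host/port values are just examples):

    # Hedged sketch: ask Flask-SocketIO explicitly for the eventlet server instead of
    # letting it fall back to Werkzeug. If eventlet cannot be used (for example an
    # eventlet release that does not yet support Python 3.9), this fails loudly
    # instead of silently downgrading.
    import eventlet
    eventlet.monkey_patch()

    from flask import Flask, render_template
    from flask_socketio import SocketIO

    app = Flask(__name__)
    socketio = SocketIO(app, async_mode='eventlet', logger=True, engineio_logger=True)

    @app.route('/')
    def index():
        return render_template('index.html')

    if __name__ == '__main__':
        # with async_mode='eventlet', socketio.run() dispatches to eventlet's WSGI server
        socketio.run(app, host='127.0.0.1', port=5000)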
Posted about 4 years ago by Dima Berehovets
I've run into a problem using eventlet with Celery. I run a Celery worker the following way:

    celery worker -P eventlet -A project -c 400 -Q default -l info -n default

so it should work with 400 green threads. However, when I monitor the process with Celery Flower, it shows only about 200 threads working, even though more than 1000 tasks have been generated for this worker. Can anyone say what is happening here? Why doesn't Celery use 400 threads, but only up to 200?
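One thing worth checking before blaming eventlet is what concurrency the worker actually registered, since Flower only samples the number of tasks executing at a given moment. A small sketch of how this could be verified (the `project.celery` import path is hypothetical; use wherever the Celery app instance actually lives):

    # Hedged sketch: confirm the pool size the worker really applied, rather than
    # relying on Flower's snapshot of currently active tasks.
    #
    #   celery -A project inspect stats    # the reply includes the pool configuration
    #   celery -A project inspect active   # tasks executing right now on each worker
    #
    # The same check from Python via the control/inspect API:
    from project.celery import app  # hypothetical import path for the Celery app

    stats = app.control.inspect().stats() or {}
    for worker_name, data in stats.items():
        print(worker_name, data.get("pool"))  # expect the configured concurrency here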
Posted about 4 years ago by Tiberiu
This code (main.py):

    import eventlet
    eventlet.monkey_patch()  # commenting this line makes the app run correctly

    import requests

    response = requests.get('http://localhost:8080/position')  # exception thrown here

throws the following exception:

    requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=8080): Max retries exceeded with url: /position (Caused by NewConnectionError(': Failed to establish a new connection: [Errno 11001] No address found'))

However, if I comment out the eventlet.monkey_patch(), it works. I must do monkey patching because I'm currently using Flask-SocketIO with eventlet and would like to have background threads (as per their documentation). Does anyone know how to fix this?

Versions: eventlet 0.30.2, requests 2.25.1
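Errno 11001 is a Windows name-resolution failure, and it only appears once monkey patching has swapped in eventlet's greendns resolver. Two things that may be worth trying, sketched below; both keep the monkey patching that Flask-SocketIO needs (the URL is the one from the question):

    # Hedged sketch: keep eventlet.monkey_patch() but take greendns out of the picture.

    # Option 1: disable eventlet's green DNS resolver. The environment variable is
    # only honoured if it is set before eventlet is imported.
    import os
    os.environ.setdefault('EVENTLET_NO_GREENDNS', 'yes')

    import eventlet
    eventlet.monkey_patch()

    import requests

    # Option 2: sidestep hostname resolution entirely by using the loopback address.
    response = requests.get('http://127.0.0.1:8080/position')
    print(response.status_code)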
Posted about 4 years ago by zambs
I am new to server development so please be kind... I am developing a test application that starts a flask-socketio server and, after interacting with clients, needs to shut it down and open another instance. However, this is not possible; I get this error:

    File "C:\Python39\lib\site-packages\eventlet\convenience.py", line 78, in listen
        sock.bind(addr)
    OSError: [WinError 10048] Only one usage of each socket address (protocol/network address/port) is normally permitted

How can I programmatically shut down the server? I looked at the answers in "How to stop flask application without using ctrl-c", and using a separate process does indeed do the trick, but I don't really want a separate process because sharing variables between processes is too tricky. I also didn't understand, from the same post, how to send a request from the server to itself in order to shut down the Flask application. This is an example of my code:

    import socketio
    import eventlet
    import eventlet.wsgi
    from flask import Flask, render_template
    import socket
    import threading
    import time

    ip_addr = socket.gethostbyname(socket.gethostname())

    appFlask = Flask(__name__)
    sio = socketio.Server()  # engineio_logger=True, logger=True

    # wrap Flask application with engineio's middleware
    app = socketio.Middleware(sio, appFlask)

    @sio.on('connect')
    def connect(sid, environ):
        print('connect ', sid)

    @sio.on('message')
    def message(sid, data):
        print('message ' + data, data)

    @sio.on('disconnect')
    def disconnect(sid):
        print('disconnect ', sid)

    @sio.on('result')
    def result(sid, data):
        print('result ', sid)

    def worker1():
        socket_port = 3000
        eventlet.wsgi.server(eventlet.listen((ip_addr, socket_port)), app)

    if __name__ == '__main__':
        sio.start_background_task(worker1)
        # do some stuff and interact with the client
        sio.sleep(2)
        # how can I close the server so that I can do the following?
        sio.start_background_task(worker1)

EDITED with flask-socketio functionality:

    import socketio
    import eventlet
    import eventlet.wsgi
    from flask import Flask, render_template
    import socket
    import threading
    import time
    import requests
    from flask import request
    from flask_socketio import SocketIO

    ip_addr = socket.gethostbyname(socket.gethostname())
    socket_port = 3000

    app = Flask(__name__)
    app.config['SECRET_KEY'] = 'secret!'
    sio = SocketIO(app)

    @app.route('/stop')
    def stop():
        sio.stop()

    @sio.on('connect')
    def connect(sid, environ):
        print('connect ', sid)

    @sio.on('message')
    def message(sid, data):
        print('message ' + data, data)

    @sio.on('disconnect')
    def disconnect(sid):
        print('disconnect ', sid)

    @sio.on('result')
    def result(sid, data):
        print('result ', sid)

    def worker1():
        eventlet.wsgi.server(eventlet.listen((ip_addr, socket_port)), app)

    if __name__ == '__main__':
        eventlet_thr = sio.start_background_task(worker1)
        # do some stuff and interact with the client
        sio.sleep(2)
        # now wait until the server is stopped
        # (a request to stop it would be invoked from a different process)
        eventlet_thr.join()
        # how can I close the server so that I can do the following?
        sio.start_background_task(worker1)
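The WinError 10048 happens because the first listener still owns the port when the second worker1() tries to bind it. One way to make restarts possible is to keep references to both the listening socket and the green thread running the WSGI loop, and tear both down before starting again; a minimal sketch along those lines (the run_server/stop_server names are illustrative, not part of python-socketio):

    # Hedged sketch: keep handles to the listener socket and the serving green thread
    # so the server can be shut down and the port re-bound later.
    import eventlet
    import eventlet.wsgi
    import socketio

    sio = socketio.Server()
    app = socketio.WSGIApp(sio)

    _listener = None
    _server_gt = None

    def run_server(host, port):
        global _listener, _server_gt
        _listener = eventlet.listen((host, port))
        # eventlet.spawn returns a GreenThread we can kill later
        _server_gt = eventlet.spawn(eventlet.wsgi.server, _listener, app)

    def stop_server():
        global _listener, _server_gt
        if _server_gt is not None:
            _server_gt.kill()        # stop the accept/serve loop
            _server_gt = None
        if _listener is not None:
            _listener.close()        # release the port so a new listen() can bind it
            _listener = None

    if __name__ == '__main__':
        run_server('127.0.0.1', 3000)
        eventlet.sleep(2)            # interact with clients here
        stop_server()
        run_server('127.0.0.1', 3000)  # a second instance can now bind the same port
        eventlet.sleep(2)
        stop_server()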
Posted about 4 years ago by Paul Brink
I have a simple Python 3 application running python-socketio. I use it to stream a list of nearly 400,000 values to a client application at 30 times each second. I have incorporated compression to reduce the network bandwidth to a more manageable level. The application works in testing, although its performance degrades very quickly: with two clients connected the frame rate is nearly 20, and with 4-5 clients the frame rate is less than 10. Why is the performance of the application degrading so quickly, and how can this be fixed?

I use this command to spin up the server:

    gunicorn -k eventlet -w 1 --reload server:app

Here is my code:

    # IMPORTS
    import socketio
    import driver as customData
    import threading
    import lz4.frame

    # SERVER SETUP
    sio = socketio.Server()
    app = socketio.WSGIApp(sio, static_files={'/': './clientView/'})

    # VARIABLE SETUP
    stream = False

    # FUNCTION DEFINITIONS
    def stream_Data():
        global stream
        while stream == True:
            sio.emit('dataStream', lz4.frame.compress(customData.stream_Raw(), 9))

    def startStream():
        global stream
        if stream == True:
            print('Stream Requested: Already in Progress...')
        elif stream == False:
            print('Stream Initialized...')
            stream = True
            STREAM_THREAD.start()

    # THREAD SETUP
    STREAM_THREAD = threading.Thread(target=stream_Data)

    # SOCKET EVENTS
    @sio.event
    def connect(sid, environ):
        print(sid, 'connected')

    @sio.event
    def disconnect(sid):
        print(sid, 'disconnected')

    @sio.event
    def streamEvent(sid, data):
        global stream
        if data == 'stream':
            startStream()
            stream = True
        if data == 'stop_Stream':
            stream = False
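Under a single eventlet worker, the raw threading.Thread runs a tight loop that never yields to the event loop, so every connected client competes with it for the same worker. A sketch of the same stream as a python-socketio background task that yields between frames (the 1/30 s interval is just a target, and driver/customData is the questioner's own module):

    # Hedged sketch: run the stream as a python-socketio background task instead of a
    # raw threading.Thread, and yield between frames so other clients keep being served.
    import socketio
    import lz4.frame
    import driver as customData  # assumed: the questioner's own data source module

    sio = socketio.Server()
    app = socketio.WSGIApp(sio, static_files={'/': './clientView/'})

    stream = False

    def stream_data():
        global stream
        while stream:
            payload = lz4.frame.compress(customData.stream_Raw(), 9)
            sio.emit('dataStream', payload)
            sio.sleep(1 / 30)  # target ~30 frames/s and yield control to the event loop

    @sio.event
    def streamEvent(sid, data):
        global stream
        if data == 'stream' and not stream:
            stream = True
            sio.start_background_task(stream_data)  # eventlet-friendly background task
        elif data == 'stop_Stream':
            stream = False

Even with the yield, emitting 400,000 compressed values per client 30 times a second is a lot of work for one eventlet worker, so some frame-rate drop as clients are added may remain.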
Posted about 4 years ago by slalomchip
I want to create a Socket.IO server in Python on Windows as a simulator for a Socket.IO client I'm writing. The server uses eventlet to listen on 0.0.0.0. The simulator uses PyQt5 and has two buttons: one button emits one message from the server and the other button emits a different message. Upon execution, the client connects to the server without issue, but the QDialog hangs and the QPushButtons are not displayed. If I comment out the eventlet.wsgi.server(...) line then the QDialog displays without issue, but (obviously) the client cannot connect to the server. Any suggestions on how I can overcome this issue, so that I can connect from the client and also emit server messages by clicking the QPushButtons? Here is my server script:

    from PyQt5.QtWidgets import QPushButton, QDialog, QApplication
    import socketio, sys, eventlet

    class My_Server(QDialog):

        def __init__(self, parent=None):
            super(My_Server, self).__init__(parent)
            self.setWindowTitle("My SocketIO Server")
            self.resize(300, 150)
            self.move(300, 200)

            self.btn1 = QPushButton(self)
            self.btn1.setText('Msg 1')
            self.btn1.move(50, 75)
            self.btn1.clicked.connect(self.send_btn1)

            self.btn2 = QPushButton(self)
            self.btn2.setText('Msg 2')
            self.btn2.move(175, 75)
            self.btn2.clicked.connect(self.send_btn2)

            self.show()

            self.sio = socketio.Server()
            self.serverapp = socketio.WSGIApp(self.sio, static_files={
                '/': {'content_type': 'text/html', 'filename': 'index.html'}})
            eventlet.wsgi.server(eventlet.listen(('', 5000)), self.serverapp)

        def send_btn1(self):
            self.sio.emit('message1', {"Message 1": "Hello"})

        def send_btn2(self):
            self.sio.emit('message2', {"Message 2": "World"})

    if __name__ == '__main__':
        app = QApplication(sys.argv)
        form = My_Server()
        form.show()
        sys.exit(app.exec_())
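The eventlet.wsgi.server() call blocks the Qt main thread before app.exec_() ever runs, which is why the dialog never paints. One simulator-grade workaround is to push that blocking call into a background thread started from __init__; a minimal sketch (the start_server_in_background helper is illustrative, not part of python-socketio, and mixing eventlet with native threads is only really suitable for a small test harness like this):

    # Hedged sketch: run the blocking eventlet WSGI loop off the Qt GUI thread so the
    # dialog and its buttons stay responsive.
    import threading
    import eventlet
    import eventlet.wsgi

    def start_server_in_background(serverapp, port=5000):
        def serve():
            eventlet.wsgi.server(eventlet.listen(('', port)), serverapp)
        t = threading.Thread(target=serve, daemon=True)  # daemon: exits with the GUI
        t.start()
        return t

In __init__, replacing the direct eventlet.wsgi.server(...) call with start_server_in_background(self.serverapp) lets the dialog display; the button handlers then call self.sio.emit(...) from the Qt thread while the server thread keeps handling connections.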
Posted about 4 years ago by Paul Brink
I have a simple Python server running python-socketio. Basically, what I want to do is make it constantly emit new data, kind of like a video stream. This should happen without any delay, a few times per second. Here is the code I have so far. The problem is that whenever the stream event starts, no other clients can connect. The print statement in streamEvent will be replaced with emitting the actual data once I can figure out how to keep the server responsive.

    import socketio

    iterations = 1

    sio = socketio.Server()
    app = socketio.WSGIApp(sio, static_files={'/': './clientView/'})

    @sio.event
    def connect(sid, environ):
        print(sid, 'connected')
        streamEvent()

    @sio.event
    def disconnect(sid):
        print(sid, 'disconnected')

    @sio.event
    def testEvent(sid, data):
        print(sid, data)
        sio.emit('testRecieved', 'test data recieved')

    def streamEvent():
        while iterations == 1:
            print('streaming')
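The while loop runs inside the connect handler and never yields, so the single event-loop worker cannot accept or service anyone else. Moving the loop into a python-socketio background task and yielding with sio.sleep() on every iteration keeps the server responsive; a minimal sketch along those lines (the 'dataStream' event name and the 0.2 s interval are just placeholders):

    # Hedged sketch: emit from a background task that yields on each iteration, instead
    # of looping inside the connect handler.
    import socketio

    sio = socketio.Server()
    app = socketio.WSGIApp(sio, static_files={'/': './clientView/'})

    streaming = False

    def stream_loop():
        while streaming:
            sio.emit('dataStream', 'streaming')  # placeholder payload, as in the question
            sio.sleep(0.2)                       # a few emits per second; also yields control

    @sio.event
    def connect(sid, environ):
        global streaming
        print(sid, 'connected')
        if not streaming:
            streaming = True
            sio.start_background_task(stream_loop)  # async-mode-friendly background task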
Posted about 4 years ago by Mick
I'm trying to optimize requests through an external proxy (rotator). Sometimes the response is fast, sometimes very slow, so the idea is to send multiple parallel requests for the same URL, take the fastest response, return the data, and close the function without waiting for the other, slower responses. There are a lot of tutorials online and SO questions about parallel requests in Python, but all of them cover parallel requests for different URLs rather than duplicating the same request, and the code waits until all requests are finished. I want to kill the parallel-request logic (preferably in a clean way) as soon as the fastest response answers.

My app runs in Python Flask with Gunicorn + eventlet. I tried eventlet green pools and Python concurrent futures, but using an eventlet GreenPool seems like a better match, since the code will run in Gunicorn + eventlet workers and Celery with eventlet workers. I'm currently using Luminati Proxy Manager (LPM) to retry failed requests. An older version seemed to support parallel requests out of the box, but the current versions no longer support this function. So I'm either trying to solve it with code in my Python app, or to add another service/tool (like LPM) that takes care of the parallel requests and picks the fastest one.

Proxy service Luminati.io provides a 'high performance parallel request' code example (based on an eventlet GreenPool); see the 'Original example' below. I edited the code to remove the proxy and logins to make it more reproducible and to avoid unpredictable proxy response timings. I'm not getting any support from Luminati, so I'm trying to figure it out on SO. For this test I'm using a simulated slow 5-second response and a fast response from httpstat.us:

    ['http://httpstat.us/200?sleep=5000', 'http://httpstat.us/200']

In the edited code I added print statements with timings to see which response comes back first. I have two problems with this code. Sometimes I can see the fast response coming back first and it prints the response data ('OK'), with the slow response 5 seconds later; however, often the code seems to wait until both responses are back (both timings exactly the same). The other problem is that, while I'm able to print and see the data from the 'fast' response immediately, the logic still waits until all responses are finished. I would like to return the data and close the function once the first response comes back. In my edited code you can see some commented-out lines where I tried, unsuccessfully, to kill the process (this just restarts the eventlet process).
Original example:

    import eventlet
    from eventlet.green.urllib import request
    import random
    import socket

    super_proxy = socket.gethostbyname('zproxy.lum-superproxy.io')

    class SingleSessionRetriever:
        url = "http://%s-session-%s:%s@" + super_proxy + ":%d"
        port = 22225

        def __init__(self, username, password, requests_limit, failures_limit):
            self._username = username
            self._password = password
            self._requests_limit = requests_limit
            self._failures_limit = failures_limit
            self._reset_session()

        def _reset_session(self):
            session_id = random.random()
            proxy = SingleSessionRetriever.url % (self._username, session_id, self._password,
                                                  SingleSessionRetriever.port)
            proxy_handler = request.ProxyHandler({'http': proxy, 'https': proxy})
            self._opener = request.build_opener(proxy_handler)
            self._requests = 0
            self._failures = 0

        def retrieve(self, url, timeout):
            while True:
                if self._requests == self._requests_limit:
                    self._reset_session()
                self._requests += 1
                try:
                    timer = eventlet.Timeout(timeout)
                    result = self._opener.open(url).read()
                    timer.cancel()
                    return result
                except:
                    timer.cancel()
                    self._failures += 1
                    if self._failures == self._failures_limit:
                        self._reset_session()

    class MultiSessionRetriever:
        def __init__(self, username, password, session_requests_limit, session_failures_limit):
            self._username = username
            self._password = password
            self._sessions_stack = []
            self._session_requests_limit = session_requests_limit
            self._session_failures_limit = session_failures_limit

        def retrieve(self, urls, timeout, parallel_sessions_limit, callback):
            pool = eventlet.GreenPool(parallel_sessions_limit)
            for url, body in pool.imap(lambda url: self._retrieve_single(url, timeout), urls):
                callback(url, body)

        def _retrieve_single(self, url, timeout):
            if self._sessions_stack:
                session = self._sessions_stack.pop()
            else:
                session = SingleSessionRetriever(self._username, self._password,
                                                 self._session_requests_limit,
                                                 self._session_failures_limit)
            body = session.retrieve(url, timeout)
            self._sessions_stack.append(session)
            return url, body

    def output(url, body):
        print(body)

    n_total_req = 100
    req_timeout = 10
    n_parallel_exit_nodes = 10
    switch_ip_every_n_req = 10
    max_failures = 2

    MultiSessionRetriever('lum-customer-c_ba028d72-zone-static', 'akssw3iy6h3y',
                          switch_ip_every_n_req, max_failures).retrieve(
        ["http://lumtest.com/myip.json"] * n_total_req,
        req_timeout, n_parallel_exit_nodes, output)

Edited code (without logins and proxies):

    def high_perf_parallel_requests(search_url):
        try:
            import datetime
            from eventlet.green.urllib import request

            results2 = []
            results1 = []

            class SingleSessionRetriever:

                def __init__(self, username, password, requests_limit, failures_limit):
                    self._username = username
                    self._password = password
                    self._requests_limit = requests_limit
                    self._failures_limit = failures_limit
                    self._reset_session()

                def _reset_session(self):
                    self._requests = 0
                    self._failures = 0

                def retrieve(self, url, timeout):
                    print("\n SingleSessionRetriever.retrieve init")
                    print(url)
                    print(datetime.datetime.now())
                    while True:
                        if self._requests == self._requests_limit:
                            self._reset_session()
                        self._requests += 1
                        try:
                            timer = eventlet.Timeout(timeout)
                            result = request.urlopen(url).read()
                            print("\n SingleSessionRetriever.retrieve result")
                            print(url)
                            print(result)
                            print(datetime.datetime.now())
                            results1.append(result)
                            timer.cancel()
                            # eventlet.kill(pool)
                            # raise Exception("Got fastest result. Kill eventlet")
                            # eventlet.kill(self)
                            # pool.kill()
                            return result
                        except:
                            timer.cancel()
                            self._failures += 1
                            if self._failures == self._failures_limit:
                                self._reset_session()

            class MultiSessionRetriever:

                def __init__(self, username, password, session_requests_limit, session_failures_limit):
                    self._returned = False
                    self._username = username
                    self._password = password
                    self._sessions_stack = []
                    self._session_requests_limit = session_requests_limit
                    self._session_failures_limit = session_failures_limit

                def retrieve(self, urls, timeout, parallel_sessions_limit, callback):
                    pool = eventlet.GreenPool(parallel_sessions_limit)
                    try:
                        # for url in urls:
                        #     print("spawn {}".format(url))
                        #     pool.spawn_n(self._retrieve_single(url, timeout))
                        # pool.waitall()
                        for url, body in pool.imap(lambda url: self._retrieve_single(url, timeout), urls):
                            if body:
                                print("\n MultiSessionRetriever.retrieve: Body received")
                                print(datetime.datetime.now())
                                # eventlet.Event.send_exception
                                # return body
                                # eventlet.kill(self)
                                # pool.kill()
                            print("\n MultiSessionRetriever.retrieve: in for loop")
                            print(url)
                            print(body)
                            print(datetime.datetime.now())
                            callback(url, body)
                    except Exception as e:
                        # eventlet.kill(pool)
                        # eventlet.kill(self)
                        print(e)
                    print("\n MultiSessionRetriever.retrieve: after loop")
                    print(datetime.datetime.now())
                    # eventlet.kill(self)

                def _retrieve_single(self, url, timeout):
                    print("\n MultiSessionRetriever._retrieve_single url:")
                    print(url)
                    print(datetime.datetime.now())
                    if self._sessions_stack:
                        session = self._sessions_stack.pop()
                    else:
                        session = SingleSessionRetriever(self._username, self._password,
                                                         self._session_requests_limit,
                                                         self._session_failures_limit)
                    body = session.retrieve(url, timeout)
                    print("\n MultiSessionRetriever._retrieve_single body:")
                    print(body)
                    print(datetime.datetime.now())
                    self._sessions_stack.append(session)
                    return url, body

            def output(url, body):
                print("\n MultiSessionRetriever.output:")
                print(url)
                print(body)
                print(datetime.datetime.now())
                results2.append(body)

            # n_total_req = 2
            req_timeout = 10
            n_parallel_exit_nodes = 2
            switch_ip_every_n_req = 1
            max_failures = 2
            urls = ['http://httpstat.us/200?sleep=5000', 'http://httpstat.us/200']

            print("start")
            print(datetime.datetime.now())
            x = MultiSessionRetriever('', '', switch_ip_every_n_req, max_failures).retrieve(
                urls, req_timeout, n_parallel_exit_nodes, output)
            print("result1:")
            print(results1)
            print("result2:")
            print(results2)
            return results2
        except Exception as e:  # assumed: the matching except clause was lost in the paste
            print(e)

Console output (I used two other URLs that respond with Fast and Slow as the response text):

    web_1 | high_perf_parallel_requests: start
    web_1 | start
    web_1 | 2021-02-04 02:28:17.503574
    web_1 |
    web_1 | MultiSessionRetriever._retrieve_single url:
    web_1 | http://httpstat.us/200?sleep=5000
    web_1 | 2021-02-04 02:28:17.503903
    web_1 |
    web_1 | SingleSessionRetriever.retrieve init
    web_1 | http://httpstat.us/200?sleep=5000
    web_1 | 2021-02-04 02:28:17.503948
    web_1 |
    web_1 | MultiSessionRetriever._retrieve_single url:
    web_1 | http://httpstat.us/200
    web_1 | 2021-02-04 02:28:17.511720
    web_1 |
    web_1 | SingleSessionRetriever.retrieve init
    web_1 | http://httpstat.us/200
    web_1 | 2021-02-04 02:28:17.511783
    web_1 |
    web_1 | SingleSessionRetriever.retrieve result
    web_1 | http://httpstat.us/200
    web_1 | b'"fast response result"\n'
    web_1 | 2021-02-04 02:28:18.269042
    web_1 |
    web_1 | MultiSessionRetriever._retrieve_single body:
    web_1 | b'"fast response result"\n'
    web_1 | 2021-02-04 02:28:18.269220
    web_1 |
    web_1 | SingleSessionRetriever.retrieve result
    web_1 | http://httpstat.us/200?sleep=5000
    web_1 | b'"slow response result"\n'
    web_1 | 2021-02-04 02:28:24.458372
    web_1 |
    web_1 | MultiSessionRetriever._retrieve_single body:
    web_1 | b'"slow response result"\n'
    web_1 | 2021-02-04 02:28:24.458499
    web_1 |
    web_1 | MultiSessionRetriever.retrieve: Body received
    web_1 | 2021-02-04 02:28:24.458814
    web_1 |
    web_1 | MultiSessionRetriever.retrieve: in for loop
    web_1 | http://httpstat.us/200?sleep=5000
    web_1 | b'"slow response result"\n'
    web_1 | 2021-02-04 02:28:24.458857
    web_1 |
    web_1 | MultiSessionRetriever.output:
    web_1 | http://httpstat.us/200?sleep=5000
    web_1 | b'"slow response result"\n'
    web_1 | 2021-02-04 02:28:24.458918
    web_1 |
    web_1 | MultiSessionRetriever.retrieve: Body received
    web_1 | 2021-02-04 02:28:24.459057
    web_1 |
    web_1 | MultiSessionRetriever.retrieve: in for loop
    web_1 | http://httpstat.us/200
    web_1 | b'"fast response result"\n'
    web_1 | 2021-02-04 02:28:24.459158
    web_1 |
    web_1 | MultiSessionRetriever.output:
    web_1 | http://httpstat.us/200
    web_1 | b'"fast response result"\n'
    web_1 | 2021-02-04 02:28:24.459206
    web_1 |
    web_1 | MultiSessionRetriever.retrieve: after loop
    web_1 | 2021-02-04 02:28:24.459482
    web_1 | result1
    web_1 | [b'"fast response result"\n', b'"slow response result"\n']
    web_1 | result2
    web_1 | [b'"slow response result"\n', b'"fast response result"\n']
    web_1 | Parallel resp = [b'"slow response result"\n', b'"fast response result"\n']

Other attempts with eventlet and concurrent futures:

    def parallel_request(url):
        fastest_result = None
        try:
            import datetime
            import eventlet
            from eventlet.green.urllib.request import urlopen

            # urls = ["http://www.google.com/intl/en_ALL/images/logo.gif",
            #         "https://www.python.org/static/img/python-logo.png",
            #         "http://us.i1.yimg.com/us.yimg.com/i/ww/beta/y3.gif"]
            urls = ['http://httpstat.us/200?sleep=5000', 'http://httpstat.us/200']

            def fetch(url):
                print("\n Fetch start")
                print(url)
                print(datetime.datetime.now())
                result = urlopen(url).read()
                print("\n Fetch result")
                print(result)
                print(datetime.datetime.now())
                return result

            pool = eventlet.GreenPool()
            print("\n Parallel start")
            print(datetime.datetime.now())
            for body in pool.imap(fetch, urls):
                print("\n Pool result")
                print(body)
                print(datetime.datetime.now())
            print("\n Parallel end")
            print(datetime.datetime.now())
        except Exception as e:
            print(e)
        print("Fastest result= {}".format(fastest_result))

Futures:

    def request_futures(url):
        try:
            import datetime
            import concurrent.futures
            import urllib.request

            urls = ['http://httpstat.us/200?sleep=5000', 'http://httpstat.us/200']

            print("\n Start Futures")
            print(datetime.datetime.now())

            # Retrieve a single page and report the URL and contents
            def load_url(url, timeout):
                with urllib.request.urlopen(url, timeout=timeout) as conn:
                    print("\n load url")
                    print(datetime.datetime.now())
                    result = conn.read()
                    print(result)
                    print(datetime.datetime.now())
                    return result

            # We can use a with statement to ensure threads are cleaned up promptly
            with concurrent.futures.ThreadPoolExecutor() as executor:
                # Start the load operations and mark each future with its URL
                future_to_url = {executor.submit(load_url, url, 60): url for url in urls}
                for future in concurrent.futures.as_completed(future_to_url):
                    print("\n Iterate future")
                    print(datetime.datetime.now())
                    url = future_to_url[future]
                    try:
                        print("\n Try future")
                        print(url)
                        print(datetime.datetime.now())
                        data = future.result()
                        print("\n Data future")
                        print(data)
                        print(datetime.datetime.now())
                    except Exception as exc:
                        print('%r generated an exception: %s' % (url, exc))
                    else:
                        print('%r page is %d bytes' % (url, len(data)))

            print("\n End Futures")
            print(datetime.datetime.now())
        except Exception as e:
            print(e)
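For "return as soon as the first duplicate finishes", pool.imap() is the wrong shape: it yields results in input order and only completes when every green thread has finished. A pattern that may be closer to what is wanted is to spawn one green thread per duplicate request, wait on a queue for whichever finishes first, and kill the rest; a minimal sketch without the proxy/session machinery (the fastest_response name and the copies/timeout defaults are illustrative):

    # Hedged sketch: duplicate the same request N times, return the first response,
    # and kill the remaining green threads instead of waiting for them.
    import eventlet
    from eventlet.green.urllib.request import urlopen
    from eventlet.queue import Queue

    def fastest_response(url, copies=3, timeout=10):
        results = Queue()

        def fetch():
            try:
                with eventlet.Timeout(timeout):
                    results.put(urlopen(url).read())
            except (eventlet.Timeout, Exception) as exc:
                results.put(exc)  # report failures so the caller is never left waiting

        threads = [eventlet.spawn(fetch) for _ in range(copies)]
        try:
            for _ in range(copies):
                item = results.get()                  # blocks until the next copy finishes
                if not isinstance(item, BaseException):
                    return item                       # first successful body wins
            raise RuntimeError("all duplicate requests failed for %s" % url)
        finally:
            for gt in threads:
                gt.kill()                             # stop the slower copies immediately

    if __name__ == '__main__':
        print(fastest_response('http://httpstat.us/200'))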