forked from pool/python-flower
- Remove no longer needed remove-faulty-test.patch
- Use backport_run_in_executor.patch to be compatible with Tornado 4
- Increase minimum dependency versions: celery >= 3.1.0 and Tornado >= 4.2.0
- Update to v0.9.3
* Fix numeric sort and sort ordering
* Support filtering tasks by received time
* Fix "not JSON serializable" `TypeError` for /api/task/info
* Fixed Auth redirect when --url_prefix is given
* OpenAPI 2.0 swagger spec initial commit
* Update tasks datatable to use POST method
* Fix hanging issues with tornado 5.0.0
* Add 'signal' query parameter to endpoint /api/task/revoke
* Use parse_version instead of tuple comparisons
* Fix /tasks endpoint failing with 500
* Add links for parent and root jobs
* Make dropdown font colour white instead of grey
* Fix incorrect response body
* Removed some legacy code for Python 2.6
* Document the GetQueueLengths API endpoint
* Enable broker support for redis+socket connections
* Fix docs for default inspect_timeout value
* Fix typo in Google OAuth 2 redirect uri env variable
* Upgrade tornado
* Fix the outdated doc link in Worker.html
* Declare futures dependency using environment markers
* Fix GitHub OAuth callback handling
* Correct spelling error
* Correct spelling and grammatical errors
* Switch away from Google+ OAuth2
* Restrict release dependencies with version upper-bound
- from v0.9.2
* Add logout button
* Fix a bug in humanizing timestamps on the tasks page
* Handle errors in custom format_task functions
* Pending tasks don't have a worker
* Fix result encoding
* Removed 500px height limitation
* Do not show debug information on 500 error
* Fixed Python 3 mistake in GitHub Login Handler
* Support environment variables in tornado 4.2.0 and >=4.3
* Fix XSS on tasks page
* Enable cross-origin websockets
* Fix bugreport
* Resolve relative config file names
* Fix broker api validation
* Loosen broker api validation
* Replace websockets with Ajax
* Fix bug in tasks page template
* Fix celery version comparison
* Refactor version comparison
* Fixes in config documents
* Fix invalid URL used for datatable query
* Use redis as default broker
* Update dashboard counters on worker table updates
* Move shut down group button to worker page
* Update navbar title
* Add a script for calling tasks
* Add a link to worker name
* Remove active task start time from worker page
* Fix py3 json serialization
* Fix Error 500 because task.worker is None
* HTML escape task args
* Disable broker cert verification
* Enable all tasks columns with --tasks-columns=all
* Add runtime to default tasks columns
* Document --tasks-columns=all
* Refactor redis broker
* Support CELERY_QUEUES option
* Remove rabbitmq-plugins enable warning
* Fix broken link to celery configuration document
- from v0.9.1
* Improve envvar handling
* By default update worker cache
* Fix task sorting for py3
* Fix missing workername
* Fix monitor tab problem of missing graphs
- from v0.9.0
* Workers can be sorted and filtered
* Tasks can be sorted, filtered by name, state, worker, runtime, etc.
* Tasks columns can be reordered and customized
* Tasks columns for worker, retries, revoked, expires, eta, etc.
* Pagination of tasks
* GitHub Auth support
* --max_workers option for limiting the number of workers
* --unix_socket option for running with unix socket
* Bug fixes
OBS-URL: https://build.opensuse.org/request/show/687482
OBS-URL: https://build.opensuse.org/package/show/devel:languages:python/python-flower?expand=0&rev=4
From 5741cbcbc5c2a75c2552326018ee97b8fe5f257f Mon Sep 17 00:00:00 2001
From: John Vandenberg <jayvdb@gmail.com>
Date: Fri, 22 Mar 2019 08:28:34 +0700
Subject: [PATCH] Backport run_in_executor

---
 flower/api/tasks.py | 21 +++++++++++++++++++--
 1 file changed, 19 insertions(+), 2 deletions(-)

diff --git a/flower/api/tasks.py b/flower/api/tasks.py
index 1f172422..f0395dea 100644
--- a/flower/api/tasks.py
+++ b/flower/api/tasks.py
@@ -78,6 +78,24 @@ def safe_result(self, result):
             return result


+def inline_run_in_executor(func, *args):
+    from tornado.concurrent import Future, chain_future
+
+    io_loop = IOLoop.current()
+    if not hasattr(io_loop, "_executor"):
+        import concurrent.futures
+        from tornado.process import cpu_count
+
+        io_loop._executor = concurrent.futures.ThreadPoolExecutor(
+            max_workers=(cpu_count() * 5)
+        )
+    executor = io_loop._executor
+    c_future = executor.submit(func, *args)
+    t_future = Future()
+    io_loop.add_future(c_future, lambda f: chain_future(f, t_future))
+    return t_future
+
+
 class TaskApply(BaseTaskHandler):
     @web.authenticated
     @gen.coroutine
@@ -138,8 +156,7 @@ def post(self, taskname):
         result = task.apply_async(args=args, kwargs=kwargs, **options)
         response = {'task-id': result.task_id}

-        response = yield IOLoop.current().run_in_executor(
-            None, self.wait_results, result, response)
+        response = yield inline_run_in_executor(self.wait_results, result, response)
         self.write(response)

     def wait_results(self, result, response):
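
For reference, a minimal standalone sketch of the technique the backport relies on: submit blocking work to a concurrent.futures thread pool and chain the resulting future into a Tornado future so a Tornado 4 coroutine can yield it. This is not part of the patch; run_blocking and blocking_work are hypothetical names, and blocking_work merely stands in for Flower's wait_results call.

    # Minimal sketch, assuming Tornado 4.x is installed.
    import concurrent.futures
    import time

    from tornado import gen
    from tornado.concurrent import Future, chain_future
    from tornado.ioloop import IOLoop

    _EXECUTOR = concurrent.futures.ThreadPoolExecutor(max_workers=4)


    def run_blocking(func, *args):
        # Submit the blocking call to a thread pool.
        c_future = _EXECUTOR.submit(func, *args)   # concurrent.futures.Future
        t_future = Future()                        # Tornado Future
        # When the thread-pool future resolves, copy its result (or exception)
        # into the Tornado future on the IOLoop thread.
        IOLoop.current().add_future(c_future, lambda f: chain_future(f, t_future))
        return t_future


    def blocking_work(seconds):
        time.sleep(seconds)                        # stand-in for wait_results()
        return {'slept': seconds}


    @gen.coroutine
    def main():
        result = yield run_blocking(blocking_work, 1)
        print(result)


    if __name__ == '__main__':
        IOLoop.current().run_sync(main)

Tornado 5 ships this behaviour as IOLoop.run_in_executor; the patch inlines an equivalent helper so Flower keeps working against Tornado 4.2+.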