How can I use structlog with Python RQ?
In my application I am using structlog as the logging system. The application also uses Python RQ. How can I make Python RQ use the logging system I am already using, so that all of my application's logs follow the same pattern?
See also questions close to this topic
-
Share memory in python-rq jobs
We are using python-rq (Redis Queue) to enqueue a time-intensive job that modifies a very large dictionary. This dictionary is currently passed back to the main application by dumping it into job.meta. However, that means the dictionary is copied every time, which severely impacts performance because of its large size.
Is there an option to share memory between the main application and the job? The main application only needs to read the dictionary.
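Since the main application only reads the dictionary, one option (a sketch, not RQ-specific; the helper names `publish_dict` and `read_dict` are hypothetical) is to serialize it once into POSIX shared memory and pass only the segment name and byte length through `job.meta`, instead of the dictionary itself:

```python
import json
from multiprocessing import shared_memory


def publish_dict(d, name="rq_result"):
    # Serialize once into a shared-memory segment that other
    # processes can attach to without re-copying the dict.
    payload = json.dumps(d).encode()
    shm = shared_memory.SharedMemory(create=True, size=len(payload), name=name)
    shm.buf[:len(payload)] = payload
    # The creator keeps the handle for cleanup; a reader only needs
    # (name, len(payload)), which are small enough for job.meta.
    return shm, len(payload)


def read_dict(name="rq_result", size=0):
    # Attach to the existing segment and decode the stored bytes.
    shm = shared_memory.SharedMemory(name=name)
    try:
        return json.loads(bytes(shm.buf[:size]))
    finally:
        shm.close()
```

The idea would be that the worker calls `publish_dict` at the end of the job and stores the name and length in `job.meta`, and the main application calls `read_dict`. The segment must eventually be `unlink()`ed by whoever created it.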
-
deserialization error when calling method in python rq distributed queue
I am trying to test how to pass a Python object to an rq worker process. I have the following classes in common.py:

```python
import time
from typing import List


class Input:
    def __init__(self, arr_list: List):
        self.arr_list = arr_list

    def compute_sum(self):
        sum = 0
        for num in self.arr_list:
            sum += num
        time.sleep(10)
        return Output(sum)


class Output:
    def __init__(self, sum_result):
        self.sum_result = sum_result

    def __str__(self):
        return str(self.sum_result)


def calculate_sum(input_obj: Input) -> Output:
    return input_obj.compute_sum()
```
I am calling this method calculate_sum to compute the sum of 100 elements through rq:

```python
from common import Input, calculate_sum


class TestRqJobSharding:
    def test_job_submission(self):
        index = 0
        batch = 10
        list_obj = [1] * 100
        jobs = []
        results = []
        while index < len(list_obj):
            new_batch = list_obj[index: index + batch]
            input_obj = Input(new_batch)
            job = get_queue().enqueue(calculate_sum, args=(input_obj,))
            index = index + batch
            jobs.append(job)
        results = submit_and_wait_till_completed(jobs)
        for result in results:
            print(f'{result}')
        print(results)
```
I am seeing the following deserialization error in the worker process:

```
Traceback (most recent call last):
  File "/home/sshil/venv3.7/lib/python3.7/site-packages/rq/worker.py", line 1056, in perform_job
    self.prepare_job_execution(job)
  File "/home/sshil/venv3.7/lib/python3.7/site-packages/rq/worker.py", line 937, in prepare_job_execution
    self.procline(msg.format(job.func_name, job.origin, time.time()))
  File "/home/sshil/venv3.7/lib/python3.7/site-packages/rq/job.py", line 297, in func_name
    self._deserialize_data()
  File "/home/sshil/venv3.7/lib/python3.7/site-packages/rq/job.py", line 265, in _deserialize_data
    raise DeserializationError() from e
rq.exceptions.DeserializationError

Traceback (most recent call last):
  File "/home/sshil/venv3.7/lib/python3.7/site-packages/rq/job.py", line 262, in _deserialize_data
    self._func_name, self._instance, self._args, self._kwargs = self.serializer.loads(self.data)
ModuleNotFoundError: No module named 'common'
```
I tried using the JSON serializer and I still see this error. It happens whenever I pass a Python object; it works fine with int or string arguments.
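The likely cause (an assumption, but consistent with the traceback) is that the worker process cannot import `common`: RQ serializes job arguments by reference, storing only the module path and class name, so the class definition must also be importable on the worker side. A small illustration of how pickle embeds only a reference, using a stdlib class:

```python
import pickle
from collections import OrderedDict

# Pickling an instance records the defining module and class name,
# not the class's code; unpickling re-imports the module.
payload = pickle.dumps(OrderedDict(a=1))
assert b"collections" in payload
assert b"OrderedDict" in payload
```

So the fix is typically to start the worker from the project root (or with the project directory on PYTHONPATH) so that `import common` succeeds in the worker process; switching serializers does not help, because any by-reference serializer still needs the import to work.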
-
ExtraAdder throws an exception, thinks EventDict is a string?
I'm trying to figure out how to actually use this processor. What I want to do is add a custom dictionary to log messages, as in the example under Debug in the standard Python logging documentation (https://docs.python.org/3/library/logging.html).
It doesn't work, whether I pass an extra or not.
```
Python 3.9.2 (default, Feb 28 2021, 17:03:44) [GCC 10.2.1 20210110] on linux
Type "help", "copyright", "credits" or "license" for more information.
>>> import structlog
>>> from structlog import stdlib, processors
>>> import logging, sys
>>> logging.basicConfig(stream=sys.stdout)
>>> structlog.configure(
...     logger_factory=stdlib.LoggerFactory(),
...     processors=[
...         stdlib.add_log_level,
...         stdlib.add_logger_name,
...         stdlib.filter_by_level,
...         stdlib.PositionalArgumentsFormatter(),
...         processors.TimeStamper(fmt="iso"),
...         structlog.processors.JSONRenderer(),
...         structlog.stdlib.ExtraAdder(),
...     ],
... )
>>> log = structlog.get_logger(__name__)
>>> type(log)
<class 'structlog._config.BoundLoggerLazyProxy'>  # looks like we got a logger

# with an extra:
>>> log.error("Log message example", extra={'a': 'b'})
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.9/dist-packages/structlog/_log_levels.py", line 124, in meth
    return self._proxy_to_logger(name, event, **kw)
  File "/usr/local/lib/python3.9/dist-packages/structlog/_base.py", line 203, in _proxy_to_logger
    args, kw = self._process_event(method_name, event, event_kw)
  File "/usr/local/lib/python3.9/dist-packages/structlog/_base.py", line 160, in _process_event
    event_dict = proc(self._logger, method_name, event_dict)
  File "/usr/local/lib/python3.9/dist-packages/structlog/stdlib.py", line 708, in __call__
    record: Optional[logging.LogRecord] = event_dict.get("_record")
AttributeError: 'str' object has no attribute 'get'

# without an extra:
>>> log.error("Log message example")
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
  File "/usr/local/lib/python3.9/dist-packages/structlog/_log_levels.py", line 124, in meth
    return self._proxy_to_logger(name, event, **kw)
  File "/usr/local/lib/python3.9/dist-packages/structlog/_base.py", line 203, in _proxy_to_logger
    args, kw = self._process_event(method_name, event, event_kw)
  File "/usr/local/lib/python3.9/dist-packages/structlog/_base.py", line 160, in _process_event
    event_dict = proc(self._logger, method_name, event_dict)
  File "/usr/local/lib/python3.9/dist-packages/structlog/stdlib.py", line 708, in __call__
    record: Optional[logging.LogRecord] = event_dict.get("_record")
AttributeError: 'str' object has no attribute 'get'
```
The configuration seems easy enough if I want to allow anything. Actually using this processor in a log message stumps me.
Any help appreciated.
-
how to capture and log to console using CapturingLoggerFactory
I'm struggling to figure out how to use CapturingLoggerFactory in my unit tests so that I can capture the logs and assert on them, but also actually log the information to the console. The CapturingLoggerFactory class doesn't log anything; it just appends the logs to a list that you can later assert on. My example code:
```python
from structlog import configure, get_logger
from structlog.processors import JSONRenderer
from structlog.testing import CapturingLoggerFactory


class MyClass():
    def __init__(self):
        self.cf = CapturingLoggerFactory()
        configure(logger_factory=self.cf, processors=[JSONRenderer()])
        self.logger = get_logger()
        self.logger.info("MyClass instantiated")
```
Now in my tests I have something like this:
```python
def test_log(self) -> None:
    my_class = MyClass()
    self.assertEqual(my_class.cf.logger.calls[-1].method_name, "info")
    log_message = json.loads(my_class.cf.logger.calls[-1].args[0])["event"]
    self.assertEqual(log_message, "MyClass instantiated")
```
This all works fine, but the actual log line "MyClass instantiated" is never printed to the console.
Does anyone know how to both print and capture this log?