Support for Custom Logger in log_queries
and log_responses
#1249
Replies: 3 comments
-
It's a nice idea. At the moment it looks like we just print the queries (`piccolo/piccolo/engine/postgres.py`, lines 543 to 544 in 448a818). The reason is that we print out formatted text (`piccolo/piccolo/engine/base.py`, lines 221 to 223 in 448a818). If we used Python's logging library instead, you should be able to set up some kind of handler or filter which specifies how to handle those logs. I can't remember off the top of my head how it's done, but it's possible.
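A sketch of that idea: if the engine emitted its query logs through a named logger (the name `piccolo.query` is an assumption here, not a logger Piccolo currently creates), users could attach their own handlers and filters without any engine changes:

```python
import logging

# Assumed logger name -- Piccolo does not currently expose this; it is
# what the engine *could* log through instead of calling print().
logger = logging.getLogger("piccolo.query")
logger.setLevel(logging.DEBUG)

captured = []

class ListHandler(logging.Handler):
    """A handler decides where records go (a file, syslog, a list, ...)."""
    def emit(self, record: logging.LogRecord) -> None:
        captured.append(self.format(record))

class DropResponses(logging.Filter):
    """A filter decides which records get through at all."""
    def filter(self, record: logging.LogRecord) -> bool:
        return not record.getMessage().startswith("Query response")

handler = ListHandler()
handler.setFormatter(logging.Formatter("%(name)s: %(message)s"))
handler.addFilter(DropResponses())
logger.addHandler(handler)

logger.debug("SELECT * FROM band")     # kept by the filter
logger.debug("Query response: [...]")  # dropped by the filter
```

Swapping `ListHandler` for `logging.FileHandler` or a syslog handler would redirect the same records elsewhere, which is exactly the flexibility being asked for.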
-
We could pass a logger object and do something like this:

```python
def print_response(self, query_id: int, response: t.List):
    if self.logger:
        self.logger.debug(
            f"\nQuery {query_id} response: {pprint.pformat(response)}"
        )
    else:
        print(
            colored_string(f"\nQuery {query_id} response:", level=Level.high)
        )
        pprint.pprint(response)
```
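A self-contained version of that pattern (the `QueryPrinter` class is illustrative, not Piccolo's actual code, and the colour formatting is omitted):

```python
import logging
import pprint
import typing as t

class QueryPrinter:
    """Illustrative stand-in for the engine; not Piccolo's real class."""

    def __init__(self, logger: t.Optional[logging.Logger] = None):
        self.logger = logger

    def print_response(self, query_id: int, response: t.List) -> None:
        if self.logger:
            # Route through the user-supplied logger.
            self.logger.debug(
                f"Query {query_id} response: {pprint.pformat(response)}"
            )
        else:
            # Fall back to plain printing (Piccolo also colours the text).
            print(f"Query {query_id} response:")
            pprint.pprint(response)
```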
-
Hello, I had the same issue, and I propose this solution for people using Postgres. To integrate it, you can subclass the Piccolo `PostgresEngine`:

```python
"""
Custom PostgresEngine to log queries.

Usage example:

    db = PostgresDriver(
        config={
            "host": db_host,
            "port": db_port,
            "database": db_name,
            "user": db_username,
            "password": db_password,
        },
    )
"""
import logging
from typing import TYPE_CHECKING, Any

import asyncpg

from piccolo.engine.postgres import PostgresEngine

if TYPE_CHECKING:
    from asyncpg import Connection, LoggedQuery

logger = logging.getLogger("asyncpg.query")


class PostgresDriver(PostgresEngine):
    async def get_new_connection(self) -> "Connection":
        """Returns a new Postgres connection with a predefined query logger.

        This method doesn't retrieve it from the pool.

        Returns:
            Connection: New Postgres connection.
        """
        conn = await super().get_new_connection()
        conn.add_query_logger(self.log_query)
        return conn

    async def start_connection_pool(self, **kwargs: Any) -> None:
        """Create a Postgres connection pool with a predefined query logger.

        Args:
            kwargs (Any): Additional Postgres connection parameters.
        """
        if self.pool:
            raise RuntimeError(
                "A pool already exists - close it first if you want to "
                "create a new pool.",
            )
        config = dict(self.config)
        config.update(**kwargs)
        self.pool = await asyncpg.create_pool(**config, init=self.init_pool)

    async def init_pool(self, conn: "Connection") -> None:
        """A user defined function to customise a Postgres connection
        in the pool.

        Args:
            conn (Connection): Postgres connection.
        """
        # Add the query logger to the connection.
        conn.add_query_logger(self.log_query)

    async def log_query(self, query: "LoggedQuery") -> None:
        """Log the `asyncpg` query using the `logger` instance.

        Args:
            query (LoggedQuery): Query to log.
        """
        logger.debug(query.query)
```

With this solution, the database responses will not be logged, but all the database queries are logged, even the […]
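One thing worth adding: the `asyncpg.query` logger used above still needs to be configured, otherwise its `DEBUG` records are silently discarded. A minimal setup (the file name is just an example):

```python
import logging

query_logger = logging.getLogger("asyncpg.query")
query_logger.setLevel(logging.DEBUG)

# Send the queries to their own file instead of the console.
handler = logging.FileHandler("queries.log")
handler.setFormatter(logging.Formatter("%(asctime)s %(message)s"))
query_logger.addHandler(handler)
```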
-
Currently, the `log_queries` and `log_responses` options are helpful for tracking database interactions, but they only log to the console. It would be beneficial to allow users to pass a custom logger so that queries and responses could be directed to specific destinations, such as a file or an external logging service.

Proposal: add an optional `logger` parameter to the `log_queries` and `log_responses` configurations, allowing users to pass a custom `logging.Logger` instance or another compatible logger like loguru.

For example:
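Something along these lines (note that the `logger` keyword shown on `PostgresEngine` is the *proposed* API and does not exist in Piccolo today):

```python
import logging

# Standard-library logger writing to a file.
db_logger = logging.getLogger("db")
db_logger.setLevel(logging.DEBUG)
db_logger.addHandler(logging.FileHandler("db_logs.log"))

# Proposed usage -- the `logger` keyword is hypothetical:
#
# DB = PostgresEngine(
#     config={"database": "my_app"},
#     log_queries=True,
#     log_responses=True,
#     logger=db_logger,
# )
```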
With this change, both queries and responses would be logged to `db_logs.log` instead of just printing to the console. This approach would improve flexibility, especially in larger applications where custom logging configurations are needed.