Django community: RSS
This page, updated regularly, aggregates Django Q&A from the Django community.
-
Django constraint on derived fields
I am trying to add a non-overlapping constraint on a PostgreSQL-specific DateRangeField: class MyModel(models.Model): timespan = DateRangeField(blank=False, null=False) status = models.CharField(max_length=8, choices=status_choices, blank=False, null=False) class Meta: constraints = [ ExclusionConstraint( name='exclude_overlapping_offer', expressions=[ ('timespan', RangeOperators.OVERLAPS), (models.Q(status='ACCEPTED'), RangeOperators.EQUAL), ], ) Basically I want to prevent having overlapping entries in the database that have status ACCEPTED. The migration runs fine, but then when I try to save the model, I get an error: AttributeError: 'Q' object has no attribute 'replace_expressions' There is a reply on a bug report that says that Q objects are not allowed in the expressions of the constraint: https://code.djangoproject.com/ticket/34805 Is there some other way to have the constraint on a derived field? -
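A hedged sketch of one possible workaround for the question above (not verified against the asker's Django version): ExclusionConstraint accepts a condition argument that does take a Q object, so the status filter can move out of expressions and into condition.

from django.contrib.postgres.constraints import ExclusionConstraint
from django.contrib.postgres.fields import DateRangeField, RangeOperators
from django.db import models

class MyModel(models.Model):
    timespan = DateRangeField()
    status = models.CharField(max_length=8)

    class Meta:
        constraints = [
            ExclusionConstraint(
                name='exclude_overlapping_offer',
                # Only the range expression stays in expressions...
                expressions=[('timespan', RangeOperators.OVERLAPS)],
                # ...while the Q object goes into condition, which accepts it.
                condition=models.Q(status='ACCEPTED'),
            ),
        ]
-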
Is there a more efficient way to handle dynamic field creation in Django REST Framework serializers?
I am working with Django REST Framework and trying to dynamically add fields to a serializer based on data selected from the UI. Specifically, I need to add fields for each commodity associated with a KindTournamentSerializer instance. These fields should represent the quantity of each commodity. from rest_framework import serializers class KindTournamentSerializer(CelestialSerializer): voucher_id = serializers.CharField(label=_("Ration number")) voucher_size = serializers.CharField(label=_("Ration size")) def __init__(self, *args, **kwargs): super().__init__(*args, **kwargs) # Dynamically determine commodities from the instance if self.instance: selected_fields = kwargs.get('selected_fields', []) if 'voucher_size' in selected_fields: # Get the first instance to extract commodities first_instance = self.instance[0] if isinstance(self.instance, TournamentSerializer) else self.instance commodities = first_instance.rations.all() # Create fields for each commodity for ration in commodities: commodity_name = ration.name sanitized_name = commodity_name.replace(" ", "_").replace("-", "_") field_name = f"commodity_{sanitized_name}" # Add the SerializerMethodField commodity_field = serializers.SerializerMethodField(label=_(commodity_name)) self.fields[field_name] = commodity_field # Dynamically bind the method for the field setattr(self, f'get_{field_name}', lambda obj, name=commodity_name: self.get_commodity_value(obj, name)) def get_commodities_data(self, obj): """Aggregate quantities of commodities from the latest voucher.""" latest_voucher = obj.latest_voucher commodity_data = {} if latest_voucher: commodities = latest_voucher.commodities.all() for commodity in commodities: name = commodity.name quantity = commodity.quantity if name not in commodity_data: commodity_data[name] = 0 commodity_data[name] += quantity return commodity_data def get_commodity_value(self, obj, commodity_name): """Retrieve the … -
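A minimal sketch for the question above, not the asker's code: one alternative that avoids per-commodity SerializerMethodFields and setattr-bound getters is a single method field returning a name-to-quantity mapping; latest_voucher and commodities mirror the attributes used in the question and are assumptions about the underlying models.

from rest_framework import serializers

class KindTournamentSketch(serializers.Serializer):
    commodities = serializers.SerializerMethodField()

    def get_commodities(self, obj):
        # Aggregate quantities from the latest voucher into one dict,
        # so no dynamic field creation is needed in __init__.
        data = {}
        voucher = getattr(obj, 'latest_voucher', None)
        if voucher:
            for commodity in voucher.commodities.all():
                data[commodity.name] = data.get(commodity.name, 0) + commodity.quantity
        return data
-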
Django-filer. Sorting files in the admin panel
Is there any way to change the sorting of files in the admin panel by download date, so that the most recently downloaded files are always at the top of the list? I didn't find any suitable setting in the django-filer documentation: https://django-filer.readthedocs.io/ -
How do I create clickable text as a URL in Django Admin?
I have a Django REST API app, and I'm trying to create a clickable link from text. So far, I've only found solutions where the URL itself is clickable, and that’s how the current functionality works. However, what I want to achieve is having some text (for example, "ad"), and then having a separate field for the URL. When a user clicks on the text ("ad"), they should be redirected to the URL provided in the other field. So I have these two fields in my model: cites = models.CharField(max_length=100, verbose_name="Cites", blank=True) cites_url = models.URLField(max_length=200, verbose_name="Cites URL", blank=True) And in admin.py: from django.utils.html import format_html class AnimalAdmin(admin.ModelAdmin): actions = ['export_to_excel'] inlines = [AnimalImageInline, AnimalFileInline] fields = [ 'cites', 'cites_url', ' ] readonly_fields = ['img_preview', 'klasse_name'] autocomplete_fields = ['category'] list_display = ('cites_link') def cites_link(self, obj): # Return a clickable link with the text from cites and URL from cites_url if obj.cites_url and obj.cites: return format_html('<a href="{}" target="_blank">{}</a>', obj.cites_url, obj.cites) elif obj.cites_url: return format_html('<a href="{}" target="_blank">Click here</a>', obj.cites_url) return "No CITES URL provided" cites_link.short_description = "CITES Link" In Django Admin, you can enter text like "ad" in one field and a URL like https://www.ad.nl in another field. However, in the current version, the … -
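A hedged sketch of the admin pieces that usually matter for the question above: list_display must be a tuple or list (note the trailing comma; ('cites_link') without it is just a string), and the same callable can be reused in readonly_fields to show the link on the change form.

from django.contrib import admin
from django.utils.html import format_html

class AnimalAdmin(admin.ModelAdmin):
    list_display = ('cites_link',)    # ('cites_link') without the comma is a plain string
    readonly_fields = ('cites_link',)

    @admin.display(description='CITES Link')
    def cites_link(self, obj):
        if obj.cites_url:
            return format_html('<a href="{}" target="_blank">{}</a>',
                               obj.cites_url, obj.cites or 'Click here')
        return 'No CITES URL provided'
-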
Request from Django to Django REST Framework in the same Docker container times out
I have 2 containers with docker compose: services: web: container_name: web build: context: . dockerfile: Dockerfile command: bash -c "python manage.py makemigrations && python manage.py migrate && python manage.py collectstatic --no-input && gunicorn mysite.wsgi:application --bind 0.0.0.0:8000" volumes: - .:/app - static:/app/static env_file: - .env ports: - "8000:8000" nginx: build: ./nginx volumes: - static:/app/static ports: - "80:80" depends_on: - web here is the nginx config: upstream django { server web:8000; } server { listen 80; location / { proxy_pass http://django; proxy_set_header X-Forwarded-Proto $scheme; } location /static/ { alias /app/static/; } # Increase client max body size to 50M client_max_body_size 50M; } In the web container I am running a Django app that also uses djangorestframework. If I try the API endpoints with Postman, they work fine. The problem is that when I try to call the API endpoint from a Django view, it times out. Any idea what is wrong? Thank you for your time. -
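A hedged sketch of one common cause and workaround for the question above, assuming gunicorn runs with a single worker: that worker is busy serving the original request, so an HTTP call back to the same service blocks until it times out. Hitting the compose service name directly and running more than one worker (e.g. gunicorn --workers 3), or calling the underlying view logic as a plain function instead of over HTTP, are things to try; the URL below is illustrative.

import requests  # assumed to be available in the web container

def call_internal_api():
    # 'web' is the compose service name; 8000 is the port gunicorn binds to,
    # so this bypasses nginx and the published host port entirely.
    return requests.get('http://web:8000/api/some-endpoint/', timeout=10)
-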
Can't create two sessions - one fails, one works
I have a POST request in my Django views from which I am launching a Selenium webdriver. I am running it on Linux and have set DISPLAY in the OS environment. When I call the request it opens the Selenium browser normally, but when the driver is still open and I call the request again, the Selenium browser can't create the session. Django uses a WSGI application; switching to ASGI may not work. I was expecting to have two drivers open, but it fails on the second one. Threading is not an issue here. class Test(generics.CreateAPIView): def create(self, request, *args, **kwargs): searchDict = dict(request.data) profile = searchDict['profile'] threading.Thread(target=self.runDriver, args = (profile,)).start() return Response(status=status.HTTP_200_OK) def runDriver(self, profile): br = Driver() useragent = 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36 Edg/121.0.0.0' profile_name = profile username = 'samsan' password = '##1234sAMIUL.' br.createUCDriver(useragent, profile_name) dr = br.ucDriver dr.get('https://www.google.com/') time.sleep(10) dr.get(dr.current_url) This is the Driver class class Driver: def __init__(self): self.driver = None self.ucDriver = None self.waitTime = 10 # self.logger = Logger() # self.executable = input("Please enter chrome executable path: ") self.executable = "seleniumDriver/chromedriver-mac-arm64/chromedriver" def createUCDriver(self, user_agent=None, profile_name='Profile 2'): # os.environ['DISPLAY'] = ':10.0' options = webdriver.ChromeOptions() if user_agent: … -
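A hedged sketch for the question above: two Chrome sessions generally cannot share the same user-data-dir/profile, so giving each driver its own temporary profile directory is one thing to try (the helper and argument values here are illustrative, not the asker's Driver class).

import tempfile
from selenium import webdriver

def create_chrome(user_agent=None):
    options = webdriver.ChromeOptions()
    # Each session gets a fresh, unique profile directory so the second
    # driver does not fight the first one over a locked profile.
    options.add_argument(f'--user-data-dir={tempfile.mkdtemp()}')
    if user_agent:
        options.add_argument(f'--user-agent={user_agent}')
    return webdriver.Chrome(options=options)
-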
How to access image files sent using AJAX in the Django backend
I'm having a problem with sending files from AJAX to Django. I collect the template form using new FormData() and append additional data to it, then I try to submit it using AJAX. On submission, I print out the submitted data in the web console and clearly see keys corresponding to a dictionary containing the names of the chosen images (images collected in the form using an input tag), and I also see the other string data. But when it reaches the Django end, I print json.loads(request.body) and only see the string data, while the keys that corresponded to images are now empty dictionaries. Also, request.FILES is empty. js code function accept(formdata){ const data1=Object.fromEntries(formdata.entries()); console.log(data1);//prints all content and i see image names in it $.ajax( {type:'POST', url:document.getElementById('url').getAttribute('data-url'), data:JSON.stringify({"formdata":data1}), headers:{ "X-Requested-With":"XMLHttpRequest", "X-CSRFToken":getCookie("csrftoken"), }, success:(data) =>{ console.log('data'); }, dataType:'json', contentType:false, processData:false } ); } function set_table(event){ event.preventDefault(); const table = document.getElementById('tb'); const list=table.children; console.log(list.length); let arr=[]; for (let row =0; row<list.length;row++ ){ let rows=list[row]; if (rows.getAttribute('name') !='head'){ let data={ "agent":rows.querySelector('#personnel').value, "number":rows.querySelector('#number').value , "work":rows.querySelector('#for').value }; arr.push(JSON.stringify(data)); } } //console.log(event.target.data); const formdata=new FormData(event.target); formdata.append('tab',arr); accept(formdata); } django views def project_creation_page(request): context={"cat":project_cat} is_ajax = request.headers.get('X-Requested-With') == 'XMLHttpRequest' if is_ajax: print(34) if request.method=='POST': data=json.loads(request.body) print(data) for file ,files in request.FILES: print(file, … -
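A minimal sketch of the Django side for the question above, assuming the FormData is posted as multipart/form-data (i.e. passed to $.ajax as data: formdata with contentType: false and processData: false, without JSON.stringify): the files then arrive in request.FILES rather than request.body, and the plain fields in request.POST.

from django.http import JsonResponse

def project_creation_page(request):
    if request.method == 'POST':
        tab = request.POST.get('tab')                  # plain string fields
        for name in request.FILES:                     # uploaded field names
            for uploaded in request.FILES.getlist(name):
                print(name, uploaded.name, uploaded.size)
        return JsonResponse({'ok': True})
    return JsonResponse({'ok': False})
-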
JSON Serialize for Following User
I have models.py as below. class User(AbstractUser): pass def serialize(self): return { "userid" : self.user.id, "username": self.username } def __str__(self): return f"{self.id} {self.username}" class Profile(models.Model): id = models.AutoField(primary_key=True) user = models.OneToOneField(User, on_delete=models.CASCADE, related_name="user_profile") follower = models.ManyToManyField(User, blank=True, related_name="following") def __str__(self): return f"{self.user.username}" def serialize(self): return { "user_id": self.user.id, "user": self.user.username, "followers": self.follower.count(), "followers_usr": [user.username for user in self.follower.all()], "following": self.user.following.count(), "following_usr": [user.username for user in self.user.following.all()] } I would like to create a profile page showing the user profile, follower and following counts, and also lists of the users in followers and following. I'm using return JsonResponse in my views.py. Everything works well until "following_usr". I tried using "following_usr": self.user.following.all() and got this error: TypeError at /profile/1 Object of type QuerySet is not JSON serializable When I try "following_usr": [user.username for user in self.user.following.all()] I got this error: AttributeError at /profile/1 'Profile' object has no attribute 'username' What is the proper way to do this? -
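A hedged sketch for the question above: because follower = ManyToManyField(User, related_name="following") lives on Profile, self.user.following.all() yields Profile objects, so the username is one hop away on profile.user.

def serialize(self):
    return {
        "user": self.user.username,
        "followers": self.follower.count(),
        "followers_usr": [u.username for u in self.follower.all()],
        "following": self.user.following.count(),
        # user.following.all() returns Profile instances (the M2M is on Profile),
        # so go through .user to reach the username:
        "following_usr": [p.user.username for p in self.user.following.all()],
    }
-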
Is it possible to use Azure Blob Storage as a PostgreSQL database server in a Django project?
I want to put the PostgreSQL database for my Django project in Azure Blob Storage, because Azure Blob Storage is cheaper than the SQL database offering on Azure. Is that technically possible? -
Django Templates Not Rendering (Only Admin Page Shows)
Problem: My Django project displays only the default admin page when I run the server (python manage.py runserver). None of my custom HTML templates are rendering. What I've Tried: I've reset the URLs and views in all apps, including the main app. Expected Behavior: I expect to see my custom HTML pages displayed when the server starts. -
fastcgi_buffers vs. proxy_buffers when serving Django API using Daphne
I'm running two containers in a docker network (either through docker-compose or AWS ECS): Daphne for ASGI to Django Nginx for forwarding requests to Daphne and serving static files This is working fine. I can request API endpoints, use the Django admin site, load static files, etc. The problem is that some API responses are enormous (>10MiB), and while the endpoints returning these massive responses are very optimized and fast themselves, I'm getting timeouts due to buffering of the response. I can tell by logs such as: [warn] 28#28: *252 an upstream response is buffered to a temporary file /var/cache/nginx/proxy_temp/1/00/0000000001 while reading upstream, client: 1.2.3.4, server: _, request: "GET /my-app/123/very-big-response/ HTTP/1.1", upstream: "http://172.17.0.2:9000/my-app/123/very-big-response/", host: "app.acme.com", referrer: "https://app.acme.com/" I have spent the past few hours reading about various nginx buffer settings, and while the docs explain fully what the options are and mean, I cannot find clear and reliable information on: Strategies for determining ballpark values for these parameters Which exact nginx directives to use To reiterate, I have two containers: #1 (daphne/django), and #2 (nginx). Container #1 (daphne/django) uses supervisord to run daphne. (Note before I continue: I'm fully aware of some other deviations from best practices here, like … -
Is it possible to save the token in the existing table and use it for login, logout, change password and forgot password, with expirations also working fine?
Should I use the existing table to store the token, and also store the reset and forgot-password tokens in the same table with expirations? To implement a secure password change mechanism that involves storing JWT tokens in the database for verification, you need to modify the previous solution to save and validate the tokens from the database. Here's a step-by-step guide to implementing this: Step 1: Create a Model for Storing JWT Tokens Create a new model in your Django app to store JWT tokens. This model will include fields for the token, the user it’s associated with, and its expiration status. # models.py in your Django app from django.db import models from django.contrib.auth.models import User from django.utils import timezone class PasswordResetToken(models.Model): user = models.ForeignKey(User, on_delete=models.CASCADE, related_name='reset_tokens') token = models.CharField(max_length=255, unique=True) is_expired = models.BooleanField(default=False) created_at = models.DateTimeField(auto_now_add=True) def __str__(self): return f'Token for {self.user.username}' def has_expired(self): # Check if the token has expired (assuming 30 minutes expiry time) expiry_time = self.created_at + timezone.timedelta(minutes=30) return timezone.now() > expiry_time Step 2: Update the request_password_change View Modify the request_password_change view to generate a JWT token, save it in the database, and send it via email. # views.py in your Django … -
Why do we need sync_to_async in Django?
The documentation says: The reason this is needed in Django is that many libraries, specifically database adapters, require that they are accessed in the same thread that they were created in. Also a lot of existing Django code assumes it all runs in the same thread, e.g. middleware adding things to a request for later use in views. But another question, "Is it safe that when Two asyncio tasks access the same awaitable object?", says Python's asyncio is thread safe. And as far as I know, since the GIL still exists, accessing one object from multiple threads should be thread safe. Can anyone give a minimal example of why we have to use await sync_to_async(foo)() instead of calling foo() directly in Django or other async apps? -
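A minimal sketch of the usual pattern, with an illustrative Book model: Django's ORM is synchronous and guarded against being called from an async context (it raises SynchronousOnlyOperation), so inside an async view the call is wrapped with sync_to_async, which runs it in a worker thread (thread_sensitive=True by default).

from asgiref.sync import sync_to_async
from django.http import JsonResponse

from myapp.models import Book  # illustrative model

async def book_count(request):
    # Calling Book.objects.count() directly here raises SynchronousOnlyOperation;
    # wrapping it hands the blocking ORM call to a worker thread instead.
    count = await sync_to_async(Book.objects.count)()
    return JsonResponse({'count': count})
-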
Django GeneratedField as ForeignKey with referential integrity
I'm trying to create a generated field that is also a foreign key to another table, while maintaining referential integrity at the database level. Basically, I'm trying to have the same effect as the following SQL, but in Django CREATE TABLE parent( id TEXT PRIMARY KEY ); CREATE TABLE child( id TEXT PRIMARY KEY, data JSONB, parent_id TEXT GENERATED ALWAYS AS (data->>'parent') STORED REFERENCES parent(id) ); I have successfully managed to create the generated field using Django 5.0 GeneratedField class Parent(models.Model): id = models.TextField(primary_key=True) class Child(models.Model): id = models.TextField(primary_key=True) data = models.JSONField() parnet_id = models.GeneratedField(expression=models.fields.json.KT('data__parent'), output_field=models.TextField(), db_persist=True) Now the problem is: how can I make this field also a foreign key? Because using ForeignKey would create a new column in the database that is not generated. I tried using ForeignObject, since it supports using an existing field as a foreign key, but the foreign key constraint was not created at the database level. class Parent(models.Model): id = models.TextField(primary_key=True) class Child(models.Model): id = models.TextField(primary_key=True) data = models.JSONField() parnet_id = models.GeneratedField(expression=models.fields.json.KT('data__parent'), output_field=models.TextField(), db_persist=True) parent = models.ForeignObject(Parent, from_fields=['parnet_id'], to_fields=['id'], on_delete=models.CASCADE) This generates the following SQL, which does not have a foreign key constraint CREATE TABLE "myapp_parent" ("id" text NOT NULL PRIMARY KEY); CREATE … -
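A hedged sketch of one workaround for the question above: keep the GeneratedField and add the REFERENCES constraint yourself in a migration with RunSQL. The constraint name is illustrative, the column name mirrors the field spelling used in the question, and Django itself will not treat the column as a relation.

from django.db import migrations

class Migration(migrations.Migration):
    dependencies = [('myapp', '0001_initial')]
    operations = [
        migrations.RunSQL(
            sql='ALTER TABLE "myapp_child" ADD CONSTRAINT "myapp_child_parent_fk" '
                'FOREIGN KEY ("parnet_id") REFERENCES "myapp_parent" ("id") '
                'ON DELETE CASCADE;',
            reverse_sql='ALTER TABLE "myapp_child" DROP CONSTRAINT "myapp_child_parent_fk";',
        ),
    ]
-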
Django ModuleNotFoundError: No module named 'anthropic'
I am trying to use the anthropic api within my django application. I am able to access and make requests to the API within a normal python file, but when I try doing the same within my django app (specifically my views.py file), it does not recognize anthropic as a module I have installed. Was wondering if anyone can help with this. I have tried including anthropic within my settings.py file but this didn't resolve the error. The anthropic docs for python sdk: https://github.com/anthropics/anthropic-sdk-python -
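A quick hedged diagnostic for the question above: a ModuleNotFoundError in Django but not in a standalone script usually means the server process runs under a different interpreter or virtual environment than the one anthropic was installed into (adding it to settings.py has no effect). Printing the interpreter from inside the Django process shows which environment needs the pip install.

import sys

# Drop this temporarily into views.py (or run it in `python manage.py shell`)
# and compare the path with the environment where anthropic was installed.
print(sys.executable)
print(sys.path)
-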
Django Prefetch Related Still Queries
I have the function def selected_location_equipment(self): qs = (Equipment.objects .prefetch_related('service_logs') .all()) return qs That returns a queryset with a few related fields grabbed. The problem is that when I access the prefetched data later in my code, it executes a query again. I've stepped through the Django code and can see where it is checking the cache for the .all() in one spot and doesn't query, but then when it's called here, it's almost like the cache is cleared. Debug Toolbar shows a query for each iteration of the loop as well. for e in equipments: last_service = list(e.service_logs.all())[-1] ... Here's the basic model definition for Equipment class ServiceLog(models.Model): equipment = models.ForeignKey(Equipment, on_delete=models.CASCADE, related_name='service_logs') -
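A hedged sketch of what usually keeps the prefetch cache warm in the situation above: evaluate the queryset once, iterate the same objects, and call only .all() on the related manager (any .filter()/.order_by() on e.service_logs issues a new query, and re-calling the function that builds the queryset starts from scratch).

equipments = list(
    Equipment.objects.prefetch_related('service_logs')
)
for e in equipments:
    logs = list(e.service_logs.all())   # served from the prefetch cache
    last_service = logs[-1] if logs else None
-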
Error: Failed to Clone MapStore2 Submodule in GeoNode Docker Build
Question: I'm trying to build a GeoNode instance using Docker, but I'm encountering an error related to cloning the MapStore2 submodule from the geonode-mapstore-client repository. Below is the error output I'm receiving: bash => [django 12/16] RUN yes w | pip install --src /usr/src -r requirements.txt 225.0s => => # fatal: fetch-pack: invalid index-pack output => => # fatal: clone of 'https://github.com/geosolutions-it/MapStore2.git' into submodule path '/usr/src/django-geonode-mapstore-client/geonode_mapstore_client/client/MapStore2' failed => => # Failed to clone 'geonode_mapstore_client/client/MapStore2'. Retry scheduled Full Error Output: => ERROR [django 12/16] RUN yes w | pip install --src /usr/src -r requirements.txt && yes w | pip install -e . ... 202.5 error: RPC failed; curl 92 HTTP/2 stream 0 was not closed cleanly: CANCEL (err 8) ... 319.8 note: This error originates from a subprocess, and is likely not a problem with pip. Dockerfile Snippet: dockerfile FROM geonode/geonode-base:4.1.0-ubuntu-22.04 LABEL GeoNode development team # Copy local GeoNode src inside container COPY /src/. /usr/src/geonode/ WORKDIR /usr/src/geonode # Configure Git to increase the buffer size and handle slow download speeds RUN git config --global http.postBuffer 524288000 && \ git config --global http.lowSpeedLimit 0 && \ git config --global http.lowSpeedTime 999999 RUN yes w | pip install … -
Wagtail: How to validate page model relations before saving
I have a page type where I want to define the title (and slug) automatically, based on some other fields and higher level model relations. To achieve that, I override the full_clean() method of my page class. The only thing that can go wrong is that the new page gets a slug that is already in use among the sibling pages. That is intended; only pages with unique field combinations that influence the slug should exist. So, if a user tries to save a page with a duplicate combination of data fields, I want to display a nice and readable ValidationError. I understand that the full_clean() method is called multiple times during editing/saving pages, following a kind of hierarchical approach, where the cleaning procedure starts with basic stuff and goes up to model relations. It seems that ValidationErrors are only caught in the UI and displayed nicely when they are not raised in the presumably last call of full_clean(), right after hitting the save button. When I raise a ValidationError when I have all the information at hand, it's not caught and the traceback is shown. Is there any way to handle a ValidationError gracefully if I can only raise … -
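A hedged sketch for the question above, with illustrative names: raising the duplicate-slug error from clean() (which full_clean() calls and whose ValidationErrors the Wagtail page form collects) is often enough to get the message rendered in the editor instead of a traceback; whether it covers every save path in the asker's setup is not guaranteed.

from django.core.exceptions import ValidationError
from wagtail.models import Page

class MyPage(Page):
    def clean(self):
        super().clean()
        # self.slug is assumed to have been derived already by this point.
        duplicates = Page.objects.sibling_of(self, inclusive=False).filter(slug=self.slug)
        if duplicates.exists():
            raise ValidationError(
                {'slug': 'A page with this combination of fields already exists here.'}
            )
-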
Django - Multiple images upload and delete
I'm looking at improving my code that allows the user to upload multiple images (files) linked to a record, and also delete them as needed. I am able to handle the first part (multiple images) with the following. models.py class APIvisit(ModelIsDeletable, SafeDeleteModel): _safedelete_policy = SOFT_DELETE created_date = models.DateTimeField(auto_now_add=True,editable=False, verbose_name=_("Créé le")) modified_date = models.DateTimeField(auto_now=True,editable=False, verbose_name=u"Modifié le") visitdate = models.DateField(_('Date de la visite'),default=datetime.date.today) [...] a lot of other fields class Meta: ordering = ('visitdate',) class APIvisitimage(ModelIsDeletable, SafeDeleteModel): _safedelete_policy = SOFT_DELETE created_date = models.DateTimeField(auto_now_add=True,editable=False, verbose_name=_("Créé le")) modified_date = models.DateTimeField(auto_now=True,editable=False, verbose_name=u"Modifié le") fk_visit = models.ForeignKey(APIvisit, on_delete=models.PROTECT, related_name=_('ImageVisite'), verbose_name=_('ImageVisite'), blank=False, null=False) image = models.FileField(upload_to="uploads/%Y/%m/%d/") forms.py from django.forms.widgets import ClearableFileInput class MultipleFileInput(forms.ClearableFileInput): allow_multiple_selected = True class MultipleFileField(forms.FileField): def __init__(self, *args, **kwargs): kwargs.setdefault("widget", MultipleFileInput()) super().__init__(*args, **kwargs) def clean(self, data, initial=None): single_file_clean = super().clean if isinstance(data, (list, tuple)): result = [single_file_clean(d, initial) for d in data] else: result = single_file_clean(data, initial) return result [...] class VisitImageForm(ModelForm): class Meta: model = APIvisitimage fields = ["image"] image = MultipleFileField(label='Choisir les photos', required=False) views.py class VisitEditView(PermissionRequiredMixin, UpdateView): permission_required = 'gestion.change_apivisit' model = APIvisit form_class = VisitForm template_name = 'visite/edition.html' success_url = '/visite/' def form_invalid(self, form): self.object_list = self.get_queryset() context = self.get_context_data(task_form=form) return self.render_to_response(context) def post(self, request, *args, **kwargs): self.object = self.get_object() … -
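A hedged sketch of the save side for the setup above, assuming the goal is one APIvisitimage row per uploaded file plus optional deletion of previously uploaded images (the helper and the delete_ids parameter are illustrative, not part of the original code).

def save_visit_images(visit, files, delete_ids=None):
    # One APIvisitimage per uploaded file, all linked to the same visit.
    for f in files:
        APIvisitimage.objects.create(fk_visit=visit, image=f)
    # Optionally soft-delete the images the user ticked for removal
    # (delete() is a soft delete here because of SafeDeleteModel).
    if delete_ids:
        for img in APIvisitimage.objects.filter(fk_visit=visit, pk__in=delete_ids):
            img.delete()
-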
Django reusable Many-to-one definition in reverse
I'm struggling to make a Many-to-one relationship reusable. Simplified, let's say I have: class Car(models.Model): ... class Wheel(models.Model): car = models.ForeignKey(Car) ... Pretty straightforward. What if, however, I'd like to also use my Wheel model on another model, Bike? Can I define the relationship in reverse, on the "One" side of the relationship? Defining this as a Many-to-many on the Vehicles would mean the same Wheel could belong to multiple Vehicles, which is not what I want. Would I have to subclass my Wheel to CarWheel and BikeWheel only to be able to differentiate the ForeignKey for each relationship? Seems like there should be a cleaner solution. -
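A hedged sketch of one common answer to the question above: a generic foreign key lets each Wheel point at exactly one vehicle of any model, without subclassing Wheel per vehicle type.

from django.contrib.contenttypes.fields import GenericForeignKey, GenericRelation
from django.contrib.contenttypes.models import ContentType
from django.db import models

class Wheel(models.Model):
    content_type = models.ForeignKey(ContentType, on_delete=models.CASCADE)
    object_id = models.PositiveIntegerField()
    vehicle = GenericForeignKey('content_type', 'object_id')  # one owner per wheel

class Car(models.Model):
    wheels = GenericRelation(Wheel)   # car.wheels.all()

class Bike(models.Model):
    wheels = GenericRelation(Wheel)   # bike.wheels.all()
-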
Django Multi-Database Setup: Error when Saving to Database
I am working on a Django project where I am using multiple databases, and I need to fetch data from a read-only MySQL database. I am using the using() method in Django to query data from this read-only database. However, when I attempt to save the queryset, I encounter the following error: The MySQL server is running with the --read-only option so it cannot execute this statement I have reviewed the Django documentation on multiple databases (link below), which suggests that specifying the using() method should allow for reading from and writing to different databases, but in this case, I'm only trying to read from the read-only database and save to another. Django documentation: https://docs.djangoproject.com/en/3.1/topics/db/multi-db/#:~:text=If%20you%20don%E2%80%99t%20specify%20using%2C%20the%20save Am I missing something here, or is there an additional configuration needed when dealing with a read-only MySQL database in Django? -
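A hedged sketch for the question above: each instance remembers which database it was loaded from, so a plain .save() on an object fetched with .using('readonly') goes back to the read-only server; the write has to be redirected explicitly (the model, field and alias names below are illustrative).

rows = MyModel.objects.using('readonly').filter(active=True)
for obj in rows:
    obj.pk = None                 # optional: insert as a new row in the target DB
    obj.save(using='default')     # without using=..., the save targets 'readonly'
-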
Getting an error while handling SSE in Django
I have created a notification master in models.py; then in signals.py I created an event for post-save and post-delete of Notification. When the event is triggered, in views.py I create an sse_view whose URL is accessed from the frontend. This URL is raising the issue "Application instance <Task pending name='Task-21' coro=<ASGIStaticFilesHandler.__call__() running at C:\Users\nihar\envs\backendenvt\Lib\site-packages\django\contrib\staticfiles\handlers.py:101> wait_for=<Future pending cb=[shield.<locals>._outer_done_callback() at C:\Users\nihar\AppData\Local\Programs\Python\Python311\Lib\asyncio\tasks.py:898, Task.task_wakeup()]>> for connection <WebRequest at 0x29eef9ae8d0 method=GET " I installed daphne, twisted and whitenoise. In settings.py - INSTALLED_APPS = ['daphne'] MIDDLEWARE = ['whitenoise.middleware.WhiteNoiseMiddleware'] ASGI_APPLICATION = 'BACKEND.asgi.application' 3. views.py code - async def update_event_notifications_sse_view(request): mail_id = request.GET.get('mail_id') permission = AsyncNotificationAccessPermission() token = request.GET.get('token') if not await sync_to_async(is_valid_token)(token, mail_id): raise PermissionDenied("You do not have permission to access this resource.") if not await permission.has_permission(request, view=None): raise PermissionDenied("You do not have permission to access this resource.") async def event_stream(): yield f"data: {json.dumps(await notifications_list(mail_id))}\n\n" while True: event_occurred = await sync_to_async(update_event_notification.wait)() if event_occurred: try: yield f"data: {json.dumps(await notifications_list(mail_id))}\n\n" await sync_to_async(update_event_notification.clear)() # Clear the event flag except Exception as e: print(f"Error in event stream: {e}") break await asyncio.sleep(60) response = StreamingHttpResponse(event_stream(), content_type='text/event-stream') response['Cache-Control'] = 'no-cache' return response async def notifications_list(mail_id): if mail_id is not None: # Fetch notifications using sync_to_async queryset = await sync_to_async(lambda: Notifications.objects.filter(ToUser__MailID=mail_id).order_by('-CreatedAt'))() serialized_data = await sync_to_async(lambda: NotificationSerializer(queryset, many=True).data)() return serialized_data -
Getting output from joined tables with Django
I have the following example tables in a MySQL DB accessed through Django class Env(models.Model): name = models.CharField() class ExecutionDetail(models.Model): executionid= models.ForeignKey(Execution) job= models.CharField() class Execution(models.Model): name= models.ForeignKey(Env) executionid= models.CharField() envname= models.ForeignKey(Env) I select data using a view def v_job_history(request, job): logger.debug("Calling history jobs") jobname=job myjobs = ExecutionDetail.objects.filter(job=job) template = loader.get_template('history_jobs.html') context = { 'myjobs': myjobs, } return HttpResponse(template.render(context, request)) Then in my HTML I try to display my data, e.g. {% for x in myjobs %} <tr> <td>{{ x.execution.envname}} </a></td> <td>{{ x.execution.name }} </a></td> <td>{{ x.job}}</td> </tr> {% endfor %} The problem is that x.execution.env_name returns Environment object (2), etc. I have tried x.execution.env_name, which returns objects, and x.env.name and x.execution.env.name, which return nothing. -
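A hedged sketch for the question above: the template prints Environment object (2) because the foreign keys resolve to model instances, so either follow the relation down to a plain field in the template (with these models that would be x.executionid.envname.name, since the ForeignKey lives on executionid) or give Env a __str__ so the instance renders as its name.

class Env(models.Model):
    name = models.CharField(max_length=100)

    def __str__(self):
        # Makes {{ x.executionid.envname }} render the environment name directly.
        return self.name
-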
Django max_length validation for BinaryField causes KeyError in translation __init__.py
I have a simple model, something like class Notenbild(models.Model): bild_data = models.BinaryField(max_length=500000, editable=True) In admin.py class BinaryFieldWithUpload(forms.FileField): def __init__(self, *, max_length=None, allow_empty_file=False, **kwargs): super().__init__(max_length=max_length, allow_empty_file=allow_empty_file, **kwargs) def to_python(self, data): data = super().to_python(data) if data: image = Image.open(data) # some more processing with the image which I omitted here byte_array = io.BytesIO() image.save(byte_array, format='PNG') return byte_array.getvalue() return None def widget_attrs(self, widget): attrs = super().widget_attrs(widget) if isinstance(widget, FileInput) and "accept" not in widget.attrs: attrs.setdefault("accept", "image/*") return attrs @admin.register(Notenbild) class NotenbildAdmin(admin.ModelAdmin): fields = [ 'bild_data', 'vorschau', ] readonly_fields = ['vorschau'] formfield_overrides = { models.BinaryField: {'form_class': BinaryFieldWithUpload}, } @admin.display(description='Bild (Vorschau)') def vorschau(self, notenbild: Notenbild): encoded_image = base64.b64encode(notenbild.bild_data).decode('utf-8') return format_html( f'<p>{len(notenbild.bild_data)} bytes</p>' f'<img src="data:image/png;base64,{encoded_image}" style="max-width:40rem; max-height:16rem" />' ) which works fine for saving images through the admin interface, which fit the size limits. However, when trying to save a file which exceeds the size limit, I get a very strange error: KeyError at /admin/library/notenbild/368/change/ "Your dictionary lacks key 'max'. Please provide it, because it is required to determine whether string is singular or plural." Django Version: 5.1.1 Exception Location: /Users/alex/Repositories/ekd-cms/venv/lib/python3.12/site-packages/django/utils/translation/__init__.py, line 130, in _get_number_value which doesn't make any sense to me. The validation correctly caught the invalid data, but seems like this might be a … -
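A heavily hedged sketch for the error above: on a forms.FileField, max_length restricts the filename length, and its error message is the pluralised string whose translation lookup appears to be blowing up. Whether this fixes the KeyError depends on where that message is being interpolated, but one way to sidestep it entirely is to stop relying on max_length-driven validation on the upload field and enforce the byte-size limit explicitly (the limit constant below is illustrative).

class BinaryFieldWithUpload(forms.FileField):
    MAX_BYTES = 500_000  # illustrative size limit, mirroring the model's max_length

    def to_python(self, data):
        data = super().to_python(data)
        if data and data.size > self.MAX_BYTES:
            raise forms.ValidationError(
                f'Uploaded image is larger than {self.MAX_BYTES} bytes.'
            )
        return data
-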
Annotating a Django QuerySet with the count of a Subquery
I'm building a job board. Each Job could have several associated Location objects. I have designed my Location and Job models as follows: class Location(BaseModel): slug = models.CharField(unique=True) city = models.OneToOneField(to="City", null=True) state = models.ForeignKey(to="State", null=True) country = models.ForeignKey(to="Country") class Job(BaseModel): title = models.CharField() description = models.TextField() locations = models.ManyToManyField( to="Location", related_name="jobs", through="JobLocation", ) So, a Location object has the flexibility to refer to our idea of a country (like United States), a state (like New York) or a city (like Manhattan). I populate the slug field of Location model like so: If Location object is a country, I use the country name. united-states If Location object is a state, I use the (state name + country name). new-york-united-states If Location object is a city, I use (city name + state name + country name). manhattan-new-york-united-states With slug field populated in this manner, I can simplify querying all the jobs in a particular location using the endswith lookup. For example, if there's a Python job in Manhattan, NY and a React job in Brooklyn, NY, I can get all the jobs in the state of New York like so: Job.objects.filter(locations__slug__endswith="new-york-united-states").distinct() Now, I would like to get a list of all …
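A hedged sketch of the usual count-via-subquery pattern, assuming the goal that gets cut off above is to annotate each Location with the number of distinct jobs whose location slugs end with that location's slug; SubqueryCount is a common community recipe (a Subquery subclass with a custom template), not a built-in Django class.

from django.db.models import IntegerField, OuterRef, Subquery

class SubqueryCount(Subquery):
    # Wraps the inner queryset in SELECT count(*) FROM (...) so the outer
    # query receives a single integer per row.
    template = '(SELECT count(*) FROM (%(subquery)s) _count)'
    output_field = IntegerField()

locations = Location.objects.annotate(
    job_count=SubqueryCount(
        Job.objects.filter(locations__slug__endswith=OuterRef('slug'))
        .order_by()
        .values('pk')
        .distinct()
    )
)
-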