Django community: RSS
This page, updated regularly, aggregates Django Q&A from the Django community.
-
How to check if a Django queryset has a matching ManyToMany + ForeignKey?
I am trying to check - in Django - if a user-generated queryset has two fields out of three which match 100%. class Foo(models.Model): # ... free_field = models.ForeignKey(FreeModel, ...) must_match_m2m_field = models.ManyToManyField(ManyModel, ...) must_match_fk_field = models.ForeignKey(BarModel, ...) # ... So, the user generates a queryset of the "Foo" model (with several objects in it), which may contain many objects with a) different FreeModels, and b) precisely matching ManyModels & BarModels. In the backend I need to check that all ManyModels and BarModels match in the queryset. For the ForeignKey objects I made the following (I think it's fair enough): def check_same_objects(object_list): return len(set(object_list)) == 1 check_bars = check_same_objects(qs_foo.values_list('must_match_fk_field', flat=True)) I made this for comparing the lists of objects in the ManyToMany fields of each object: def check_same_lists(object_lists): for lst in object_lists: if set(lst) != set(object_lists[0]): return False return True should_match_foos = [] for e in qs_foo: should_match_foos.append(e.must_match_m2m_field.all().values_list('id', flat=True)) check_manies = check_same_lists(should_match_foos) # finally checking both 'must-match-fields': check_both = all([check_bars, check_manies]) What is a more elegant Pythonic/Django way of checking that these two fields match 100% across the whole queryset? -
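A possible tightening of the check above (a sketch, not the author's code): let the queryset count distinct values for the single-valued ForeignKey, and compare plain sets for the ManyToMany side. Field names are taken from the question's Foo model.

```python
# Sketch: verify that all rows share one FK value and one identical set of M2M ids.
def fields_match_everywhere(qs_foo):
    # ForeignKey: exactly one distinct value across the whole queryset
    fk_ok = qs_foo.values("must_match_fk_field").distinct().count() == 1

    # ManyToMany: every object must carry the same set of related ids
    id_sets = [
        set(obj.must_match_m2m_field.values_list("id", flat=True))
        for obj in qs_foo
    ]
    m2m_ok = all(s == id_sets[0] for s in id_sets) if id_sets else True

    return fk_ok and m2m_ok
```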
Django Channels WebSocket connects but immediately closes (Live chat app)
I am building a live streaming website where users can watch a live class and interact via WebSocket chat. I have integrated Django Channels, Redis, and HLS.js for video playback. The chat connects via a WebSocket to Django Channels. However, the issue is: The WebSocket connects successfully, but immediately closes afterward. The browser console shows multiple attempts to reconnect, but the problem persists. There is no clear error message except "WebSocket connection closed" in the JavaScript console. My Setup Django 5.1.8 channels channels-redis Redis server is running (redis-cli ping returns PONG) Development server (local machine, runserver + daphne for Channels) Code Snippets HTML/JS Side (live_stream.html) <script> // WebSocket Chat functionality const streamId = "UDOOSDIHOH49849"; // Your stream_id let chatSocket = null; let reconnectAttempts = 0; const maxReconnectAttempts = 5; // You can fetch real username from Django template if available const username = "Anonymous"; // 🔥 You can dynamically change this later function connectWebSocket() { chatSocket = new WebSocket( 'ws://' + window.location.host + '/ws/chat/' + streamId + '/' ); chatSocket.onopen = function(e) { console.log('WebSocket connection established'); reconnectAttempts = 0; }; chatSocket.onmessage = function(e) { const data = JSON.parse(e.data); const chatBox = document.getElementById('chatMessages'); const newMessage = document.createElement('p'); newMessage.innerHTML = `<strong>${data.username}:</strong> ${data.message}`; … -
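Since the question doesn't include the consumer, here is a minimal consumer sketch (the group name and the stream_id URL kwarg are assumptions) that can help isolate whether the immediate close comes from an exception inside connect() — for example a failing channel-layer call when Redis is not reachable from Daphne — or from the routing/ASGI configuration.

```python
# consumers.py — minimal chat consumer for debugging the immediate-close issue.
import json
from channels.generic.websocket import AsyncWebsocketConsumer

class ChatConsumer(AsyncWebsocketConsumer):
    async def connect(self):
        # Assumes the route is ws/chat/<stream_id>/ with a named kwarg "stream_id".
        self.stream_id = self.scope["url_route"]["kwargs"]["stream_id"]
        self.group_name = f"chat_{self.stream_id}"
        await self.channel_layer.group_add(self.group_name, self.channel_name)
        await self.accept()

    async def disconnect(self, close_code):
        await self.channel_layer.group_discard(self.group_name, self.channel_name)

    async def receive(self, text_data=None, bytes_data=None):
        data = json.loads(text_data)
        await self.channel_layer.group_send(
            self.group_name,
            {
                "type": "chat_message",
                "message": data.get("message", ""),
                "username": data.get("username", "Anonymous"),
            },
        )

    async def chat_message(self, event):
        # Handler for group_send messages of type "chat_message".
        await self.send(text_data=json.dumps(
            {"message": event["message"], "username": event["username"]}
        ))
```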
Product visible in Admin but not returned from database queries
I'm working on a Django project and I'm seeing a strange issue. There's a product that appears in the Django admin, but when I run queries (raw SQL or the Django ORM), it's not returned at all, as if it doesn't exist. I am using a MySQL database. I checked the database and the object isn't saved there. -
How to use pagination with djangochannelsrestframework?
I'm using djangochannelsrestframework for my project and want to use Pagination. I found the PaginatedModelListMixin. This is my consumer: class UserConsumer(GenericAsyncModelAPIConsumer): queryset = User.objects.all() serializer_class = UserSerializer pagination_class = WebsocketPageNumberPagination @model_observer(User) async def user_activity(self, message, observer=None, **kwargs): await self.send_json(message) @user_activity.serializer def user_activity_serializer(self, instance, action, **kwargs): return { "action": action.value, "data": UserSerializer(instance).data, } async def connect(self): await self.accept() await self.user_activity.subscribe() The GenericAsyncModelAPIConsumer is just a wrapper for all the CRUD mixins class GenericAsyncModelAPIConsumer( PaginatedModelListMixin, CreateModelMixin, UpdateModelMixin, RetrieveModelMixin, DeleteModelMixin, GenericAsyncAPIConsumer, ): pass The WebsocketPageNumberPagination should be a wrapper for the rest_framework's PageNumberPagination, but it didn't work for me. I send the request with a js WebSocket like this: class ModelWebSocket extends WebSocket { items = reactive([]) constructor(url, protocols, pk = 'id') { // Call the parent constructor super(url, protocols) // List all items when the connection is opened this.onopen = () => { console.debug('[WS] Connected') this.list() } // Handle incoming messages this.onmessage = (event) => { const message = JSON.parse(event.data) console.log('[WS] Message', message) // Some more stuff, but the message is the interessting } // Close and error handling // ... } list() { return new Promise((resolve, reject) => { const requestId = this.#getAndSetPendingRequest(resolve, reject) this.send( JSON.stringify({ action: 'list', request_id: requestId, page_size: … -
Using async in Django views to connect to a LiveKit backend throws missing arguments
I'm new to the async side of Django REST Framework. I currently have a Django REST API on Django v5 with all functions written as synchronous views. However, I'm attempting to add a WebRTC calling feature using the LiveKit server. I'm attempting to connect my Django REST API to the LiveKit server (self-hosted on Ubuntu 22.04) using this documentation (https://github.com/livekit/python-sdks) to create a room before connecting. The documentation clearly states RoomService uses asyncio and aiohttp to make API calls. It needs to be used with an event loop. Here is my code for this: # Creating a room # RoomService uses asyncio and aiohttp to make API calls. It needs to be used with an event loop. async def createLiveKitRoom(self, request): request_data = request.data.dict() serializer = CreateLiveKitRoomSerializer(data=request_data) serializer.is_valid() data = serializer.validated_data room_uuid = data.get("room_uuid") # Will read LIVEKIT_URL, LIVEKIT_API_KEY, and LIVEKIT_API_SECRET from environment variables lkapi = LiveKitAPI( "http://${nginx_sfu_media_server_intra_chat_ip}:${sfu_media_server_intra_i_chat_port}" ) room_info = await lkapi.room.create_room( CreateRoomRequest(name=room_uuid, empty_timeout=10 * 60, max_participants=20) ) print(room_info) await lkapi.aclose() return room_info asyncio.run(createLiveKitRoom()) I first have to create a room_uuid on my Django end using the usual synchronous PUT call (which I already have) and pass this room_uuid to the above async call so that the room is created on LiveKit … -
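The missing-arguments error comes from asyncio.run(createLiveKitRoom()) calling a two-argument coroutine with none. One hedged way to structure it (a sketch, not the SDK's prescribed view pattern) is a plain async helper that only needs the room name, driven from the existing synchronous DRF view; the import path follows the python-sdks README but may differ by SDK version, and the URL is a placeholder.

```python
# Sketch: keep the DRF view synchronous, push only the LiveKit call into asyncio.run().
import asyncio
from livekit import api  # per the livekit python-sdks README; adjust to your installed version

async def create_livekit_room(room_uuid: str):
    # Reads LIVEKIT_API_KEY / LIVEKIT_API_SECRET from the environment as in the question.
    lkapi = api.LiveKitAPI("http://livekit.internal:7880")  # placeholder URL
    try:
        return await lkapi.room.create_room(
            api.CreateRoomRequest(name=room_uuid, empty_timeout=10 * 60, max_participants=20)
        )
    finally:
        await lkapi.aclose()

# Inside the existing synchronous view, after the serializer produced room_uuid:
# room_info = asyncio.run(create_livekit_room(room_uuid))
```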
I am trying to import categories, but lines that have a parent are causing an issue
I have an empty product category table and I am trying to import categories from a CSV file. I can import other categories for other apps, but not for products, so I had the CSV file checked and I'm told it's fine. I get the following errors, and they all start at the line that has a parent. Line number: 29 - ProductCategory matching query does not exist. Line number: 30 - ProductCategory matching query does not exist. Line number: 31 - ProductCategory matching query does not exist. Line number: 32 - ProductCategory matching query does not exist. Line number: 33 - ProductCategory matching query does not exist. If it's a new import, why is it looking for an existing category? If I remove all the lines that have a parent then I can import, and after importing and putting back the lines that have a parent I get the same error again, which now does not make sense. Here is my admin.py file: from django.contrib import admin from .models import * from import_export import resources from import_export.admin import ImportExportModelAdmin class CategoryResource(resources.ModelResource): class Meta: model = ProductCategory fields = ("id", "name", "description", "parent") class ProductCategoryAdmin(ImportExportModelAdmin): list_display = ("name", "id", … -
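A hedged sketch of one common way around the DoesNotExist: resolve the parent column explicitly and create any missing parents before each row is imported. It assumes the CSV's parent column holds the parent's name; if it holds a pk instead, keep the default widget and make sure parent rows appear earlier in the file.

```python
# Sketch: explicit ForeignKeyWidget plus get_or_create of parents in before_import_row.
from import_export import fields, resources
from import_export.widgets import ForeignKeyWidget
from .models import ProductCategory

class CategoryResource(resources.ModelResource):
    parent = fields.Field(
        column_name="parent",
        attribute="parent",
        widget=ForeignKeyWidget(ProductCategory, field="name"),  # look parents up by name (assumption)
    )

    def before_import_row(self, row, **kwargs):
        # Ensure the referenced parent exists before this row is saved.
        parent_name = row.get("parent")
        if parent_name:
            ProductCategory.objects.get_or_create(name=parent_name)

    class Meta:
        model = ProductCategory
        fields = ("id", "name", "description", "parent")
```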
Dynamic initial values for a Django Model
How would I go about prepopulating a field in the Wagtail admin UI with HTTP GET parameters? I have seen some solutions using the deprecated contrib.ModelAdmin, but have not really found anything using the new ModelViewSet. My simplified use case would be a simple calendar in the admin UI (using some JavaScript calendar like fullcalendar.js) where I would create a new Event by dragging a timeframe and having the browser visit a URL like /admin/event/new?start=startdate&end=enddate, showing the Event form with the start and end fields prepopulated from the timeframe. I have the following model: class Event(models.Model): title = models.CharField(max_length=255) [...] class EventOccurrence(Orderable): event = ParentalKey(Event, on_delete=models.CASCADE, related_name='event_occurrence') start = DateTimeField() end = DateTimeField() So far I have tried to use a custom form class inheriting from WagtailAdminModelForm, which works nicely for the prepopulating, but I have no way to access the request object to fetch the GET parameters. Helpful AIs would like me to use the deprecated ModelAdmin or inject some JavaScript to prepopulate the fields on the frontend. My personal hail mary would be to create the event via an API and then just refer the user to the freshly created event, but I would like to avoid … -
Translate external lib text Django
I currently use Base64ImageField from drf_extra_fields in a serializer. This class uses INVALID_FILE_MESSAGE = _("Please upload a valid image.") where _ comes from from django.utils.translation import gettext_lazy as _ I implemented translations (from English to French) with python manage.py makemessages and python manage.py compilemessages to use them. But to translate the error message from Base64ImageField, I have not found any relevant solution: Write the translation myself in the django.po file, but as soon as I run python manage.py makemessages it gets commented out. Avoid my custom translation being commented out by having a file.py with _("Please upload a valid image.") Write a custom makemessages command to avoid the translation being commented out. Is there any other solution, or do only "dirty tricks" work for this issue? -
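Of the options listed, the second one (a small Python file that only marks the string) is the least dirty and needs no custom command: makemessages then sees the literal in your own source tree and keeps the django.po entry instead of commenting it out. A sketch, with an arbitrary file name:

```python
# myapp/translation_strings.py — exists only so makemessages picks up strings
# that live in third-party packages.
from django.utils.translation import gettext_lazy as _

# Copied verbatim from drf_extra_fields' Base64ImageField.INVALID_FILE_MESSAGE
_("Please upload a valid image.")
```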
Overriding collectstatic and manifest static files storage
I am trying to integrate Terser JS minification into the Django collectstatic command by overriding whitenoise's CompressedManifestStaticFilesStorage in the code below. The _save() method minifies the copied files, but the hashed files are either stored as the original or empty, due to the content.close() call I make during the minification. How can I work around this so that both the original files and the hashed files are minified with Terser? class MinifiedCompressedManifestStaticFilesStorage(CompressedManifestStaticFilesStorage): def minify_js(self, content_str): """Minify JavaScript using Terser and validate output.""" terser_path = ( os.path.abspath("./node_modules/.bin/terser.cmd") if os.name == "nt" else "./node_modules/.bin/terser" ) try: command = f'"{terser_path}" -m -c' if os.name == "nt" else [terser_path, "-m", "-c"] # Explicitly specify Terser CLI path if installed locally result = subprocess.run( command, input=content_str.encode("utf-8"), capture_output=True, check=True, ) minified = result.stdout if not minified: raise ValueError("Terser returned empty output") return minified except (subprocess.CalledProcessError, FileNotFoundError, ValueError) as e: print(f"Minification failed: {str(e)}. Using original content.") return content_str.encode("utf-8") # Fallback to original def _save(self, name, content): if name.endswith(".js"): # Read and close original content content_str = content.read().decode("utf-8") content.close() # Minify and create new ContentFile minified_bytes = self.minify_js(content_str) content = ContentFile(minified_bytes, name=name) content.seek(0) # Reset pointer for parent class return super()._save(name, content) -
Upwork GraphQL job search API: correct query and pagination (the one in the docs is not working)
Documentation link: https://www.upwork.com/developer/documentation/graphql/api/docs/index.html#mapping-jobs My query: query = """ fragment MarketplaceJobpostingSearchEdgeFragment on MarketplaceJobPostingSearchEdge { node { title description createdDateTime skills { name } duration job { contractTerms { contractType hourlyContractTerms { engagementType } } } client { location { country } totalFeedback totalPostedJobs totalHires verificationStatus totalReviews } } } fragment PageInfoFragment on PageInfo { hasNextPage endCursor } query marketplaceJobPostings( $marketPlaceJobFilter: MarketplaceJobFilter $searchType: MarketplaceJobPostingSearchType $sortAttributes: [MarketplaceJobPostingSearchSortAttribute] $pagination: PaginationInput ) { marketplaceJobPostings( marketPlaceJobFilter: $marketPlaceJobFilter searchType: $searchType sortAttributes: $sortAttributes pagination: $pagination ) { totalCount edges { ...MarketplaceJobpostingSearchEdgeFragment } pageInfo { ...PageInfoFragment } } } """ -
How to create an account approval admin page in Django
I'm trying to create an admin page in Django to approve user accounts. Here are the specific requirements: Background I have a landlord registration system. Each landlord must provide personal information and details about their rental property when registering. Landlord accounts are inactive by default. Administrators need to review the property information before activating the account. Main Models # User model class User(AbstractUser): user_type = models.CharField(max_length=10, choices=UserType, default=UserType.TENANT) phone_number = models.CharField(max_length=10, unique=True, blank=True, null=True) # Other fields... # Property model class Property(models.Model): owner = models.ForeignKey("accounts.User", on_delete=models.CASCADE, related_name="properties") name = models.CharField(max_length=256) status = models.CharField(max_length=10, choices=PropertyStatus.choices, default=PropertyStatus.PENDING) # Other fields... Question I want to create a Django admin page that: Displays a list of unapproved landlord accounts When clicking on an account, shows detailed user information and their registered property details Has functionality to approve or reject the account (if the property is approved, the account will be activated) I've thought about using a custom ModelAdmin with readonly_fields to display detailed information, but I'm not clear on the best way to: Display information from multiple models (User and Property) in the same admin page Add actions to approve/reject accounts What's the best way to implement this? Code examples would be very helpful. … -
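A sketch of a dedicated approval page built with stock Django admin features: a proxy model so pending landlords get their own changelist, a read-only Property inline on the detail page, and bulk approve/reject actions. The "landlord" user_type value and the PropertyStatus values are assumptions, and the proxy model would normally live in models.py.

```python
from django.contrib import admin
from .models import User, Property

class PendingLandlord(User):
    class Meta:
        proxy = True
        verbose_name = "Pending landlord"

class PropertyInline(admin.TabularInline):
    model = Property
    fk_name = "owner"
    extra = 0
    can_delete = False
    readonly_fields = ("name", "status")

@admin.register(PendingLandlord)
class PendingLandlordAdmin(admin.ModelAdmin):
    list_display = ("username", "email", "phone_number", "is_active")
    inlines = [PropertyInline]
    actions = ["approve", "reject"]

    def get_queryset(self, request):
        # Only show landlord accounts that are still waiting for approval.
        return super().get_queryset(request).filter(user_type="landlord", is_active=False)

    @admin.action(description="Approve selected landlords")
    def approve(self, request, queryset):
        queryset.update(is_active=True)
        Property.objects.filter(owner__in=queryset).update(status="approved")  # assumed status value

    @admin.action(description="Reject selected landlords")
    def reject(self, request, queryset):
        Property.objects.filter(owner__in=queryset).update(status="rejected")  # assumed status value
        queryset.update(is_active=False)
```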
CORS Error between 2 local projects Django and React
I have a Django project running at http://127.0.0.1:8000 with these settings: DEBUG = True INSTALLED_APPS = [ ... 'corsheaders', ... ] MIDDLEWARE = [ 'corsheaders.middleware.CorsMiddleware', ... 'django.middleware.common.CommonMiddleware', ... ] ALLOWED_HOSTS = [] CORS_ALLOWED_ORIGINS = [ 'http://localhost:5173', 'http://127.0.0.1:5173', ] I have a React project running at http://localhost:5173 with this App.jsx file: function App() { // Fetch tournaments from the server const [tournaments, setTournaments] = useState([]); useEffect(() => { fetch('http://127.0.0.1:8000/api/tournaments', { method: 'GET', mode: 'cors', }) .then(response => response.json()) .then(data => setTournaments(data)) .catch(error => console.error('Error fetching tournaments:', error)); }, []); return ( <> <TournamentList tournaments={tournaments} /> </> ) } The request to http://127.0.0.1:8000/api/tournaments works using Postman, but I get this basic CORS error when using the React front end: Access to fetch at 'http://127.0.0.1:8000/api/tournaments' from origin 'http://localhost:5173' has been blocked by CORS policy: No 'Access-Control-Allow-Origin' header is present on the requested resource. What am I missing here? -
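The settings shown look right (corsheaders installed, CorsMiddleware above CommonMiddleware, the Vite origins listed). One cause worth ruling out — hedged, because the urls.py isn't shown: if the URL pattern is api/tournaments/ with a trailing slash, the fetch to /api/tournaments triggers Django's APPEND_SLASH redirect, and the browser blocks the redirected response while Postman silently follows it. A quick diagnostic from the Python side:

```python
# Sketch: request the endpoint with an Origin header and look for a redirect or a
# missing CORS header. Assumes the "requests" package is installed.
import requests

resp = requests.get(
    "http://127.0.0.1:8000/api/tournaments",
    headers={"Origin": "http://localhost:5173"},
    allow_redirects=False,
)
print(resp.status_code)  # 301 here points at an APPEND_SLASH redirect to .../tournaments/
print(resp.headers.get("Access-Control-Allow-Origin"))  # should echo the Origin when corsheaders applies
```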
I am trying to migrate after creating a Django application in VS Code
raise ImproperlyConfigured( django.core.exceptions.ImproperlyConfigured: 'mssql' isn't an available database backend or couldn't be imported. Check the above exception. To use one of the built-in backends, use 'django.db.backends.XXX', where XXX is one of: 'mysql', 'oracle', 'postgresql', 'sqlite3' -
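This traceback means the ENGINE is set to 'mssql' but the backend that provides it isn't installed: 'mssql' is not one of Django's built-ins, it comes from the third-party mssql-django package (which also needs pyodbc and the Microsoft ODBC driver). A sketch of the matching setup, with placeholder credentials:

```python
# pip install mssql-django pyodbc   (plus the Microsoft ODBC Driver for SQL Server)
# settings.py
DATABASES = {
    "default": {
        "ENGINE": "mssql",
        "NAME": "mydatabase",
        "USER": "sa",
        "PASSWORD": "change-me",
        "HOST": "localhost",
        "PORT": "1433",
        "OPTIONS": {"driver": "ODBC Driver 17 for SQL Server"},
    }
}
```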
Translating month and day names in Django templates
I would like to print day names and month names in the currently selected language, but Django keeps showing them in English even though language selection is working fine. In my template: {{ LANGUAGE_CODE }} {% localize on %} {{ day|date:"l" }} {% endlocalize %} The result is: it Friday I was expecting: it Venerdì If I try to show the date with the default format, using this code: {{ day }}, the date format correctly changes depending on the currently selected language, but the day and month names are still not localized. So, for example, if English is selected I get April 25th 2025, while if Italian is selected I get 25th April 2025. Different date format, but April is always in English. How can I translate day/month names? This is my settings.py: USE_I18N = True USE_L10N = True USE_TZ = True -
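Django's own locale files do contain translated day and month names, so the usual question is whether the Italian translation is really active when the date filter runs. A quick shell check (a sketch) separates the two cases: if it prints an Italian day name, the locale data is fine and the problem is how the request/template activates the language (for example LocaleMiddleware placement), not the date filter itself.

```python
# Run in `python manage.py shell`
import datetime
from django.utils import translation, formats

translation.activate("it")
# "l j F Y" asks for the weekday and month names; with the Italian locale active
# this should come out as something like "venerdì 25 aprile 2025".
print(formats.date_format(datetime.date(2025, 4, 25), "l j F Y"))
translation.deactivate()
```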
model.predict hangs in celery/uwsgi
import numpy as np import tensorflow as tf import tensorflow_hub as hub from apps.common.utils.error_handling import suppress_callable_to_sentry from django.conf import settings from threading import Lock MODEL_PATH = settings.BASE_DIR / "apps/core/utils/nsfw_detector/nsfw.299x299.h5" model = tf.keras.models.load_model(MODEL_PATH, custom_objects={"KerasLayer": hub.KerasLayer}, compile=False) IMAGE_DIM = 299 TOTAL_THRESHOLD = 0.9 INDIVIDUAL_THRESHOLD = 0.7 predict_lock = Lock() @suppress_callable_to_sentry(Exception, return_value=False) def is_nsfw(image): if image.mode == "RGBA": image = image.convert("RGB") image = image.resize((IMAGE_DIM, IMAGE_DIM)) image = np.array(image) / 255.0 image = np.expand_dims(image, axis=0) with predict_lock: preds = model.predict(image)[0] categories = ["drawings", "hentai", "neutral", "porn", "sexy"] probabilities = {cat: float(pred) for cat, pred in zip(categories, preds)} individual_nsfw_prob = max(probabilities["porn"], probabilities["hentai"], probabilities["sexy"]) total_nsfw_prob = probabilities["porn"] + probabilities["hentai"] + probabilities["sexy"] return (individual_nsfw_prob > INDIVIDUAL_THRESHOLD) or (total_nsfw_prob > TOTAL_THRESHOLD) This works from the Python shell and the Django shell, but gets stuck at the predict step in uWSGI and in Celery. Does anyone have any idea why that might be happening? I put a bunch of breakpoints in and the problem is at the prediction itself; in the shell it returns in ~100 ms, but it hangs in uWSGI and Celery for 10+ minutes (I didn't try for longer, as I think it is obvious it won't return). I tried it with and without the lock, same result. -
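A common explanation for exactly this symptom (hedged, since it depends on how the workers are started): the Keras model is built at import time in the parent process, and uWSGI and Celery's default prefork pool then fork workers; TensorFlow's internal state generally does not survive a fork, so predict() deadlocks in the child while working fine in a single-process shell. Loading the model lazily, once per worker process, avoids inheriting it across the fork (uWSGI's lazy-apps option, or Celery's solo/threads pool, attack the same problem from the other side).

```python
# Sketch: defer model construction until first use inside the worker process.
import threading
import tensorflow as tf
import tensorflow_hub as hub
from django.conf import settings

MODEL_PATH = settings.BASE_DIR / "apps/core/utils/nsfw_detector/nsfw.299x299.h5"
_model = None
_model_lock = threading.Lock()

def get_model():
    global _model
    if _model is None:
        with _model_lock:
            if _model is None:  # double-checked so concurrent threads load it only once
                _model = tf.keras.models.load_model(
                    MODEL_PATH,
                    custom_objects={"KerasLayer": hub.KerasLayer},
                    compile=False,
                )
    return _model

# In is_nsfw(): preds = get_model().predict(image)[0]
```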
Choosing a storage schema for second-by-second telemetry data (PostgreSQL/Django/Grafana)
I am a software developer with limited experience in database design. I am working on an application that receives telemetry packets every second from equipment to an operations center. Each packet is identified by a fixed SID, which defines the data structure: SID 1 → temperature of equipment 1, 2, 3, 4 SID 2 → voltage of batteries 1, 2, 3 etc. Background: Solution 1 (generic EAV) is already implemented and running in production, but it generates a very large data volume. Rewriting the storage layer would be very costly in both development and maintenance. On the Django side, I can display the latest value of each parameter with nanosecond precision without issues. However, in Grafana, any attempt to build a historical dashboard (e.g., one month of temperature data) consumes massive resources (CPU/memory) to execute queries. Objectives: Keep real-time display of the latest values in Django Enable efficient plotting of time series over several months in Grafana without resource exhaustion 1. Solution 1 (generic EAV, in production) I use a single Django model to store each parameter in an Entity–Attribute–Value table: # models.py class ParameterHK(models.Model): id = models.BigAutoField(primary_key=True, db_index=True) date_hk_reception = models.DateTimeField(auto_now_add=True, db_index=True) sid = models.IntegerField(db_index=True) equipment = models.CharField(max_length=500) label_parameter … -
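For comparison with the EAV table, a hedged sketch of the "wide row per SID" shape that usually keeps Grafana queries cheap: one table per SID (or packet family), one row per second, one column per channel, so a month of one temperature channel is a single-column scan over an indexed time column. Column names are invented; pairing this with time-based partitioning or downsampled rollups (e.g. one-minute averages) is the usual next step.

```python
# models.py — sketch of a wide telemetry table for SID 1 (temperatures of equipment 1-4).
from django.db import models

class Sid1TemperatureHK(models.Model):
    time = models.DateTimeField(db_index=True)  # packet timestamp, one row per second
    temp_eq1 = models.FloatField(null=True)
    temp_eq2 = models.FloatField(null=True)
    temp_eq3 = models.FloatField(null=True)
    temp_eq4 = models.FloatField(null=True)

    class Meta:
        indexes = [models.Index(fields=["-time"])]
        ordering = ["-time"]
```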
Forbidden (CSRF cookie not set.): /517775/users/approve-decline/4
@method_decorator(csrf_exempt, name='dispatch') class ApproveOrDeclineUserView(APIView): def patch(self, request, org_code, user_id): try: organization = Organization.objects.get(code=org_code) except Organization.DoesNotExist: return Response({'detail': 'Invalid organization code'}, status=status.HTTP_404_NOT_FOUND) try: user = ClientUser.objects.get(id=user_id, organization=organization) except ClientUser.DoesNotExist: return Response({'detail': 'User not found in this organization'}, status=status.HTTP_404_NOT_FOUND) decision = request.data.get('decision') if decision == 'accept': user.is_active = True user.status = 'Active' user.save() # Email credentials send_mail( subject='Your Account Has Been Approved', message=f"Hello {user.full_name},\n\n" f"Your account has been approved.\n\n" f"Login here: {settings.FRONTEND_URL}{organization.code}/login\n\n" f"Welcome aboard!\n\n" f"Best regards,\n" f"ISapce Team\n\n" f"If you have any questions, feel free to reach out to us via email at samsoncoded@gmail.com", from_email=settings.EMAIL_HOST_USER, recipient_list=[user.email], fail_silently=False, ) return Response({'message': 'User approved and email sent.'}, status=status.HTTP_200_OK) elif decision == 'decline': user.delete() return Response({'message': 'User account declined and deleted.'}, status=status.HTTP_200_OK) return Response({'detail': 'Invalid decision. Must be "accept" or "decline".'}, status=status.HTTP_400_BAD_REQUEST) I am stuck. Can someone please tell me where I am getting it all wrong. All other aspect of the application are working but the moment I try to access the endpoint I get an error 403. -
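Two hedged things to check here. The log format ("Forbidden (CSRF cookie not set.)") is the one Django's CsrfViewMiddleware emits, so first confirm the decorated view really is the one mapped to that URL; DRF's APIView is normally already CSRF-exempt at the middleware level. If the rejection instead comes from DRF's SessionAuthentication (which re-applies the CSRF check inside the view), the sketch below bypasses it for this endpoint — only reasonable if the endpoint is protected by something else, such as token auth.

```python
# Sketch: a SessionAuthentication subclass that skips DRF's CSRF enforcement.
from rest_framework.authentication import SessionAuthentication
from rest_framework.views import APIView

class CsrfExemptSessionAuthentication(SessionAuthentication):
    def enforce_csrf(self, request):
        return  # intentionally skip the CSRF check

class ApproveOrDeclineUserView(APIView):
    authentication_classes = [CsrfExemptSessionAuthentication]
    # patch() stays exactly as in the question
```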
Is there any library in Python for matching users' input (characters)? [closed]
I want a library in Python that matches user input against some words. I need it for Django. I tried to find a library, but RapidFuzz isn't good for it. I have to hand my project in to my teacher a month from now, please help me. -
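If RapidFuzz doesn't fit, the standard library already ships a small fuzzy matcher, difflib, which is often enough for matching a typed word against a known list (thefuzz is the usual third-party alternative). A sketch, with an illustrative word list:

```python
import difflib

KNOWN_WORDS = ["django", "python", "database", "template"]  # illustrative list

def closest_matches(user_input: str, n: int = 3):
    # Returns up to n known words whose similarity to the input is >= cutoff.
    return difflib.get_close_matches(user_input.lower(), KNOWN_WORDS, n=n, cutoff=0.6)

print(closest_matches("jango"))   # ['django']
print(closest_matches("pyton"))   # ['python']
```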
A page that works on the Django site does not work in the admin panel; other subpages work
I have deployed our site on a Windows server and everything works smoothly, but unfortunately the contact page in the admin panel does not work. I have tried many things and done research, but I could not solve the problem. (The original post includes screenshots: home/urls.py, urls.py, the view/iletisim code, and the error page.) -
How to deal with recurrence dates in Django?
I'm currently developing a Gym website and using django-recurrence to handle recurring training sessions. However, I'm unsure how to work with recurrence dates in a Django QuerySet. Specifically, I want to sort trainings by the next upcoming date and filter trainings by a specific date, for example, to show sessions that occur on a selected day. What's the best way to achieve this using django-recurrence? Should I extract the next occurrence manually and store it in the model, or is there a more efficient approach for filtering and sorting recurring events? Any advice or examples would be greatly appreciated! class Training(TimestampModel): template = models.ForeignKey(TrainingTemplate, on_delete=models.CASCADE, related_name='trainings') start_time = models.TimeField() end_time = models.TimeField() location = models.ForeignKey(TrainingLocation, on_delete=models.PROTECT, related_name='trainings') recurrences = RecurrenceField() members = models.ManyToManyField(Member, through=MemberTraining, related_name='trainings', blank=True) objects = TrainingManager() I have a few ideas on how to handle this, but they all feel a bit over complicated. Add a next_training_date field to the Training model and use a Celery ETA task to update this field every time the training ends. For example, when an admin creates a new training, the next_training_date is calculated on save. Then, a scheduled Celery task (with ETA) is created to update this date again after … -
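For the amount of data a gym site typically has, a hedged middle ground is to compute occurrences in Python with django-recurrence's own helpers (after/between on the recurrences value) and sort/filter in memory, only falling back to the denormalized next_training_date column from idea 1 if this becomes a real queryset-level need. Exact keyword arguments (notably dtstart) depend on how the recurrence rules were saved, so treat this as a sketch.

```python
from datetime import datetime, timedelta

def next_occurrence(training, now=None):
    # Next occurrence of this training's recurrence rule at or after "now".
    now = now or datetime.now()
    return training.recurrences.after(now, True)  # second arg inc=True: "now" itself counts

def trainings_by_next_date(qs):
    # Sort in memory; trainings with no future occurrence go last.
    return sorted(qs, key=lambda t: next_occurrence(t) or datetime.max)

def trainings_on(qs, day):
    # Trainings with at least one occurrence on the given calendar day.
    start = datetime.combine(day, datetime.min.time())
    end = start + timedelta(days=1)
    return [t for t in qs if t.recurrences.between(start, end, inc=True)]
```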
Date and Time setting in Django
My Django (5) instance adds a.m. and p.m. to times, but here we are used to 24-hour clocks, so I wish to change this behavior. In my model I have this field: added_date = models.DateTimeField("date added"), and it currently displays as: April 23, 2025, 10:01 a.m. I have tried many things to change this, including these settings: # Internationalization # https://docs.djangoproject.com/en/5.1/topics/i18n/ LANGUAGE_CODE = 'en-us' TIME_ZONE = 'Europe/Amsterdam' USE_I18N = False USE_TZ = True USE_L10N = False DATETIME_FORMAT = "c" I have also tried changing TIME_FORMAT and DATE_FORMAT and various format strings like DATE_FORMAT = "%Y-%m-%d", and other things specified here: https://docs.djangoproject.com/en/5.2/ref/settings/#datetime-format, but the only thing that changed anything so far was to use {{ dataset.added_date|time:"H:i" }} in the HTML. I'd prefer to just set it in one place. How do I do that? -
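In Django 5 the locale's formats generally win over a bare DATETIME_FORMAT setting (USE_L10N is gone and localization is always on), which is why changing that setting appears to do nothing; the documented single place to override the display format project-wide is a custom format module. A sketch, with an illustrative project name:

```python
# settings.py
FORMAT_MODULE_PATH = ["myproject.formats"]  # package path is illustrative

# myproject/formats/en/formats.py  (one package per language code, each directory with an __init__.py)
DATETIME_FORMAT = "j F Y, H:i"  # e.g. "23 April 2025, 10:01"
TIME_FORMAT = "H:i"
```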
Is there any way to know if an email address exists before creating a user in Django?
I have a RegistrationSerializer in which I create a user with is_verified=False. And I want to prevent ill-intentioned users from creating fake accounts. Even if they can't log in due to the email verification step, it would still be a headache if someone just posted a lot of fake users. class RegistrationSerializer(serializers.ModelSerializer): class Meta: model = User fields = ('email', 'password', 'cpf', 'is_active') extra_kwargs = { 'password': {'write_only': True} } def validate(self, data): return data def create(self, validated_data): user = User.objects.create_user(**validated_data) token = generate_email_verification_token.make_token(user) verification_url = f"{os.environ.get('FRONTEND_URL')}/verify-email?uid={user.id}&token={token}" subject = "..." plain_message = ( "...{verification_url}..." ) html_message = f""" ...{verification_url}... """ send_mail( subject, plain_message, settings.DEFAULT_FROM_EMAIL, [user.email], html_message=html_message, fail_silently=False, ) return user And I tried wrapping send_mail in a try/except to delete the created user if there was any problem while trying to send the email, but nothing worked: try: send_mail( subject, plain_message, settings.DEFAULT_FROM_EMAIL, [user.email], html_message=html_message, fail_silently=False, ) except BadHeaderError: # If mail's Subject is not properly formatted. print('Invalid header found.') User.objects.delete(id=user.id) raise ValidationError except SMTPException as e: # It will catch other errors related to SMTP. User.objects.delete(user=user) print('There was an error sending an email.'+ e) raise ValidationError except: # It will catch All other possible errors. User.objects.delete(user=user) print("Mail Sending Failed!") raise ValidationError So, … -
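A hedged sketch of the usual pattern for the "delete the user if the mail fails" part: create the user and send the mail inside one transaction.atomic() block, so raising on a mail failure rolls the INSERT back and no manual delete is needed (note that User.objects.delete(id=...) in the snippet above is not a valid manager call; it would have to be user.delete()). The variable names (subject, plain_message, html_message) are the ones already built in the question's create().

```python
from django.core.mail import send_mail
from django.db import transaction
from rest_framework import serializers

def create(self, validated_data):
    with transaction.atomic():
        user = User.objects.create_user(**validated_data)
        # ... build token, verification_url, subject, plain_message, html_message as before ...
        try:
            send_mail(
                subject, plain_message, settings.DEFAULT_FROM_EMAIL,
                [user.email], html_message=html_message, fail_silently=False,
            )
        except Exception as exc:
            # Raising inside atomic() rolls back the user INSERT.
            raise serializers.ValidationError("Could not send the verification email.") from exc
    return user
```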
DRF: How can I validate data vs instance when many=True?
I am using a DRF serializer to update data in bulk. This is how I instantiate the serializer: # Order incoming data and DB instances data = sorted(data, key=lambda x: x["id"]) instances = WorksheetChecklistItem.objects.filter(id__in=(row["id"] for row in data)).order_by("id") # Serialize and save serializer = update.WorksheetChecklistItemSerializer(instance=instances, data=data, many=True) if not serializer.is_valid(): # ... more logic ... And this is the serializer: class WorksheetChecklistItemSerializer(serializers.ModelSerializer): class Meta: model = WorksheetChecklistItem fields = ["id", "value", "outcome"] def update(self, instance, validated_data): instance.outcome = validated_data.get("outcome", instance.outcome) instance.value = validated_data.get("value", instance.value) instance.done = instance.outcome is not None instance.save() return instance def validate(self, data): """Custom validation for the checklist item.""" instance = self.instance if not instance: raise serializers.ValidationError("Instance is required for validation") # Only update is allowed # Validate "must" condition if instance.must and not data.get("done"): raise serializers.ValidationError(f"Checklist item {instance.id} is required but not completed.") # Validate that value is a number if instance.check_type == WorksheetChecklistItem.CheckType.VALUE and not isinstance(data.get("value"), (int, float)): raise serializers.ValidationError(f"Checklist item {instance.id} requires a numeric value.") return data So I'm relying on the default ListSerializer class that is triggered when the serializer is instantiated with the many=True argument. My validation fails because the validate method does not have an "instance" argument like the update method … -
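One hedged way around this: when many=True, the child serializer's self.instance is the whole list (or None), so per-row validate() can't rely on it; instead, declare id as an explicit writable field and hand the serializer a mapping of instances via context, then look the matching instance up inside validate(). Names mirror the question; the per-item checks themselves stay as they are.

```python
class WorksheetChecklistItemSerializer(serializers.ModelSerializer):
    id = serializers.IntegerField()  # writable so it survives into the validated data

    class Meta:
        model = WorksheetChecklistItem
        fields = ["id", "value", "outcome"]

    def validate(self, data):
        instance = self.context["instances_by_id"].get(data["id"])
        if instance is None:
            raise serializers.ValidationError(f"Unknown checklist item {data['id']}.")
        # ... same "must" / numeric-value checks as before, using `instance` ...
        return data

# In the view:
# instances = WorksheetChecklistItem.objects.filter(id__in=[r["id"] for r in data]).order_by("id")
# serializer = WorksheetChecklistItemSerializer(
#     instance=instances, data=data, many=True,
#     context={"instances_by_id": {obj.id: obj for obj in instances}},
# )
```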
Django: querying two ManyToMany fields on the same model
Given the following models: class Color(models.Model): name = models.CharField() class Child(models.Model): fave_colors = models.ManyToManyField(Color) tshirt_colors = models.ManyToManyField(Color) How would I construct a query to find children who own t-shirts that are their favorite colors? i.e. lucky_kids = Child.objects.filter( fave_colors__exact=tshirt_colors ) # obvious but not valid query -
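A hedged single-query option: F() expressions can reference another relation in a filter, so "there exists a color that is both a favorite and a t-shirt color" can be expressed by comparing the two joins directly; distinct() collapses the duplicate rows the double join produces. Worth verifying the generated SQL on your Django version.

```python
from django.db.models import F

# Children owning at least one t-shirt in one of their favorite colors.
lucky_kids = Child.objects.filter(tshirt_colors=F("fave_colors")).distinct()
```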
Using pytest and mongoengine, data is created in the main database instead of a test one
I've installed these packages: python -m pip install pytest pytest-django And created a fixture: # core/services/tests/fixtures/checkout.py import pytest from bson import ObjectId from datetime import datetime from core.models.src.checkout import Checkout @pytest.fixture(scope="session") def checkout(mongo_db): checkout = Checkout( user_id=59, amount=35_641, ) checkout.save() return checkout and imported it in the conftest.py in the same directory: # core/service/tests/conftest.py from core.service.tests.fixtures.checkout import * Here's how I connect to the test database: # conftest.py import pytest from mongoengine import connect, disconnect, connection @pytest.fixture(scope="session", autouse=True) def mongo_db(): connect( db="db", name="testdb", alias="test_db", host="mongodb://localhost:27017/", serverSelectionTimeoutMS=5000, ) connection._connections.clear() yield disconnect() And this is my actual test: import json import pytest from core.service.checkout import a_function def test_a_function(checkout): assert checkout.value is False response = a_function(id=checkout.id, value=True) assert response.status_code == 200 response_data = json.loads(response.content.decode("UTF-8")) assert response_data.get("success", None) is True checkout.reload() assert checkout.value is True But every time I run pytest, a new record is created in the main database. How can I fix this to use a test database?
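A hedged reading of why the writes land in the main database: the fixture opens the test connection under alias="test_db", but Checkout (like any Document without a db_alias in its meta) saves through mongoengine's default alias, which still points at the production connection opened elsewhere. Registering the test database as the default alias for the test session (or giving every Document a db_alias and connecting that alias to the test DB) routes the writes correctly. Sketch:

```python
# conftest.py
import pytest
from mongoengine import connect, disconnect

@pytest.fixture(scope="session", autouse=True)
def mongo_db():
    disconnect()  # drop any default-alias connection the app made at import time
    connect(
        db="testdb",                        # the default alias now points at the test database
        host="mongodb://localhost:27017/",
        serverSelectionTimeoutMS=5000,
    )
    yield
    disconnect()
```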