Mirror of https://github.com/inventree/InvenTree.git (synced 2025-09-13 22:21:37 +00:00)
Import update (#10188)
* Add field to "update" existing records
* Ensure the ID is first
* Prevent editing of "ID" field
* Extract db instance
* Bump API version
* Prevent edit of "id" field
* Refactoring
* Enhanced playwright tests for data importing
* Update docs
* Update src/backend/InvenTree/importer/models.py (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Update src/frontend/src/forms/ImporterForms.tsx (Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>)
* Fix silly AI mistake
* Fix for table pagination - Ensure page does not exceed available records
* Bug fix for playwright test
* Add end-to-end API testing
* Fix unit tests
* Adjust table page logic
* Ensure sensible page size
* Simplify playwright test
* Simplify test again
* Tweak unit test - Importing has invalidated the BOM?
* Adjust playwright tests
* Further playwright fixes

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
BIN  docs/docs/assets/images/admin/import_select_id.png (new file, binary not shown, 36 KiB)
BIN  docs/docs/assets/images/admin/import_session_create_update.png (new file, binary not shown, 23 KiB)
BIN  docs/docs/assets/images/admin/import_update_process.png (new file, binary not shown, 59 KiB)
@@ -76,3 +76,37 @@ Each individual row can be imported, or removed (deleted) by the user. Once all
 ### Import Completed
 
 Once all records have been processed, the import session is considered complete. The import session can be closed, and the imported records are now stored in the database.
+
+## Updating Existing Records
+
+The data import process can also be used to update existing records in the database. This requires that the imported data file contains a unique identifier for each record, which can be used to match the records in the database.
+
+The basic outline of this process is:
+
+1. Export the existing records to a CSV file.
+2. Modify the CSV file to update the records as required.
+3. Upload the modified CSV file to the import session.
+
+!!! note "Mixing Creation and Update"
+    It is not possible to mix the creation of new records with the updating of existing records in a single import session. If you wish to create new records, you must create a separate import session for that purpose.
+
+### Create Import Session
+
+!!! note "Admin Center"
+    Updating existing records can only be performed when creating a new import session from the [Admin Center](./admin.md#admin-center).
+
+Create a new import session, and ensure that the *Update Existing Records* option is selected. This will allow the import session to update existing records in the database.
+
+{{ image("admin/import_session_create_update.png", "Update existing records") }}
+
+### Map Data Fields
+
+When mapping the data fields, ensure that the `ID` field is correctly mapped to the corresponding column in the file:
+
+{{ image("admin/import_select_id.png", "Update existing records") }}
+
+### Process Data
+
+When processing the data, each row will be matched against an existing record in the database. If a match is found, the existing record will be updated with the new data from the imported file.
+
+{{ image("admin/import_update_process.png", "Update existing records") }}
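The export, modify, and re-upload steps documented above amount to a CSV round trip in which the `ID` column must be preserved. A minimal sketch in Python (the column names here are illustrative, not a fixed InvenTree schema):

```python
import csv
import io

# Simulated export: the "ID" column is what lets the importer match
# each row back to an existing database record.
exported = (
    "ID,Name,Description\n"
    "1,Electronics,Electronic components and systems\n"
    "2,Mechanical,Mechanical components\n"
)

rows = list(csv.DictReader(io.StringIO(exported)))

# Step 2: modify a field, leaving the ID untouched
for row in rows:
    if row["ID"] == "2":
        row["Description"] = "Mechanical components and fasteners"

# Write the modified file, ready to upload to the import session
out = io.StringIO()
writer = csv.DictWriter(out, fieldnames=["ID", "Name", "Description"])
writer.writeheader()
writer.writerows(rows)
print(out.getvalue())
```

Dropping or renaming the `ID` column at this stage would make the rows impossible to match against existing records.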
@@ -1,11 +1,14 @@
 """InvenTree API version information."""
 
 # InvenTree API version
-INVENTREE_API_VERSION = 386
+INVENTREE_API_VERSION = 387
 
 """Increment this API version number whenever there is a significant change to the API that any clients need to know about."""
 
 INVENTREE_API_TEXT = """
+v387 -> 2025-08-19 : https://github.com/inventree/InvenTree/pull/10188
+    - Adds "update_records" field to the DataImportSession API
+
 v386 -> 2025-08-11 : https://github.com/inventree/InvenTree/pull/8191
     - Adds "consumed" field to the BuildItem API
     - Adds API endpoint to consume stock against a BuildOrder
@@ -0,0 +1,22 @@
+# Generated by Django 4.2.23 on 2025-08-18 13:57
+
+from django.db import migrations, models
+
+
+class Migration(migrations.Migration):
+
+    dependencies = [
+        ("importer", "0004_alter_dataimportsession_model_type"),
+    ]
+
+    operations = [
+        migrations.AddField(
+            model_name="dataimportsession",
+            name="update_records",
+            field=models.BooleanField(
+                default=False,
+                help_text="If enabled, existing records will be updated with new data",
+                verbose_name="Update Existing Records",
+            ),
+        ),
+    ]
@@ -1,6 +1,7 @@
 """Model definitions for the 'importer' app."""
 
 import json
+from collections import OrderedDict
 from typing import Optional
 
 from django.contrib.auth.models import User
@@ -39,6 +40,8 @@ class DataImportSession(models.Model):
     field_filters: JSONField for field filter values - optional field API filters
     """
 
+    ID_FIELD_LABEL = 'id'
+
     class ModelChoices(RenderChoices):
         """Model choices for data import sessions."""
 
@@ -118,6 +121,12 @@ class DataImportSession(models.Model):
         validators=[importer.validators.validate_field_defaults],
     )
 
+    update_records = models.BooleanField(
+        default=False,
+        verbose_name=_('Update Existing Records'),
+        help_text=_('If enabled, existing records will be updated with new data'),
+    )
+
     @property
     def field_mapping(self) -> dict:
         """Construct a dict of field mappings for this import session.
@@ -351,13 +360,25 @@ class DataImportSession(models.Model):
 
         metadata = InvenTreeMetadata()
 
+        fields = OrderedDict()
+
+        if self.update_records:
+            # If we are updating records, ensure the ID field is included
+            fields[self.ID_FIELD_LABEL] = {
+                'label': _('ID'),
+                'help_text': _('Existing database identifier for the record'),
+                'type': 'integer',
+                'required': True,
+                'read_only': False,
+            }
+
         if serializer_class := self.serializer_class:
             serializer = serializer_class(data={}, importing=True)
-            fields = metadata.get_serializer_info(serializer)
-        else:
-            fields = {}
 
+            fields.update(metadata.get_serializer_info(serializer))
+
+        # Cache the available fields against this instance
         self._available_fields = fields
 
         return fields
 
     def required_fields(self) -> dict:
@@ -370,6 +391,10 @@ class DataImportSession(models.Model):
             if info.get('required', False):
                 required[field] = info
 
+            elif self.update_records and field == self.ID_FIELD_LABEL:
+                # If we are updating records, the ID field is required
+                required[field] = info
+
         return required
 
 
@@ -630,11 +655,13 @@ class DataImportRow(models.Model):
 
         return data
 
-    def construct_serializer(self, request=None):
+    def construct_serializer(self, instance=None, request=None):
         """Construct a serializer object for this row."""
         if serializer_class := self.session.serializer_class:
             return serializer_class(
-                data=self.serializer_data(), context={'request': request}
+                instance=instance,
+                data=self.serializer_data(),
+                context={'request': request},
             )
 
     def validate(self, commit=False, request=None) -> bool:
@@ -654,7 +681,26 @@ class DataImportRow(models.Model):
             # Row has already been completed
             return True
 
-        serializer = self.construct_serializer(request=request)
+        if self.session.update_records:
+            # Extract the ID field from the data
+            instance_id = self.data.get(self.session.ID_FIELD_LABEL, None)
+
+            if not instance_id:
+                raise DjangoValidationError(
+                    _('ID is required for updating existing records.')
+                )
+
+            try:
+                instance = self.session.model_class.objects.get(pk=instance_id)
+            except self.session.model_class.DoesNotExist:
+                raise DjangoValidationError(_('No record found with the provided ID.'))
+            except ValueError:
+                raise DjangoValidationError(_('Invalid ID format provided.'))
+
+            serializer = self.construct_serializer(instance=instance, request=request)
+
+        else:
+            serializer = self.construct_serializer(request=request)
+
         if not serializer:
             self.errors = {
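The update path added to `validate()` follows a lookup-and-fail-loudly pattern: read the `id` value from the row, reject missing or malformed identifiers, and only then fetch the target instance so the serializer performs an update instead of a create. A framework-free sketch of that resolution step (a plain dict stands in for the database table, and `ValueError` for Django's validation error; names are illustrative):

```python
def resolve_instance(row_data: dict, records: dict, id_field: str = "id"):
    """Resolve an existing record for an import row (illustrative only)."""
    raw_id = row_data.get(id_field)

    if not raw_id:
        # Update sessions cannot proceed without an identifier
        raise ValueError("ID is required for updating existing records.")

    try:
        pk = int(raw_id)
    except (TypeError, ValueError):
        raise ValueError("Invalid ID format provided.")

    if pk not in records:
        raise ValueError("No record found with the provided ID.")

    return records[pk]


records = {1: {"name": "Electronics"}, 2: {"name": "Mechanical"}}
print(resolve_instance({"id": "2"}, records))  # {'name': 'Mechanical'}
```

Passing the resolved instance into the serializer is what switches DRF from create to update semantics, which is what the `construct_serializer(instance=...)` change above enables.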
@@ -41,6 +41,7 @@ class DataImportSessionSerializer(InvenTreeModelSerializer):
             'pk',
             'timestamp',
             'data_file',
+            'update_records',
             'model_type',
             'available_fields',
             'status',
@@ -0,0 +1,6 @@
+ID,Name,Description,Default Location,Default keywords,Level,Parent Category,Parts,Subcategories,Path,Starred,Structural,Icon,Parent default location
+23,Category 0,"Part category, level 1",,,0,,0,5,Category 0,False,False,,
+1,Electronics,Electronic components and systems,,,0,,135,12,Electronics,False,False,,
+17,Furniture,Furniture and associated things,,,0,,22,2,Furniture,False,False,,
+2,Mechanical,Mechanical components,,,0,,263,3,Mechanical,False,False,,
+20,Paint,"Paints, inks, etc",,,0,,5,0,Paint,False,False,,
@@ -3,25 +3,23 @@
 import os
 
 from django.core.files.base import ContentFile
+from django.urls import reverse
 
 from importer.models import DataImportRow, DataImportSession
-from InvenTree.unit_test import AdminTestCase, InvenTreeTestCase
+from InvenTree.unit_test import AdminTestCase, InvenTreeAPITestCase, InvenTreeTestCase
 
 
 class ImporterMixin:
     """Helpers for import tests."""
 
-    def helper_file(self):
+    def helper_file(self, fn: str) -> ContentFile:
         """Return test data."""
-        fn = os.path.join(os.path.dirname(__file__), 'test_data', 'companies.csv')
+        file_path = os.path.join(os.path.dirname(__file__), 'test_data', fn)
 
-        with open(fn, encoding='utf-8') as input_file:
+        with open(file_path, encoding='utf-8') as input_file:
             data = input_file.read()
 
-        return data
-
-    def helper_content(self):
-        """Return content file."""
-        return ContentFile(self.helper_file(), 'companies.csv')
+        return ContentFile(data, fn)
 
 
 class ImporterTest(ImporterMixin, InvenTreeTestCase):
@@ -33,8 +31,10 @@ class ImporterTest(ImporterMixin, InvenTreeTestCase):
 
         n = Company.objects.count()
 
+        data_file = self.helper_file('companies.csv')
+
         session = DataImportSession.objects.create(
-            data_file=self.helper_content(), model_type='company'
+            data_file=data_file, model_type='company'
         )
 
         session.extract_columns()
@@ -74,13 +74,116 @@ class ImporterTest(ImporterMixin, InvenTreeTestCase):
         """Test default field values."""
 
 
+class ImportAPITest(ImporterMixin, InvenTreeAPITestCase):
+    """End-to-end tests for the importer API."""
+
+    def test_import(self):
+        """Test full import process via the API."""
+        from part.models import PartCategory
+
+        N = PartCategory.objects.count()
+
+        url = reverse('api-importer-session-list')
+
+        # Load data file
+        data_file = self.helper_file('part_categories.csv')
+
+        data = self.post(
+            url,
+            {'model_type': 'partcategory', 'data_file': data_file},
+            format='multipart',
+        ).data
+
+        self.assertFalse(data['update_records'])
+        self.assertEqual(data['model_type'], 'partcategory')
+
+        # No data has been imported yet
+        self.assertEqual(data['row_count'], 0)
+        self.assertEqual(data['completed_row_count'], 0)
+
+        field_names = data['available_fields'].keys()
+
+        for fn in ['name', 'default_location', 'description']:
+            self.assertIn(fn, field_names)
+
+        self.assertEqual(len(data['columns']), 14)
+        for col in ['Name', 'Parent Category', 'Path']:
+            self.assertIn(col, data['columns'])
+
+        session_id = data['pk']
+
+        # Accept the field mappings
+        url = reverse('api-import-session-accept-fields', kwargs={'pk': session_id})
+
+        # Initially the user does not have the right permissions
+        self.post(url, expected_code=403)
+
+        # Assign correct permission to user
+        self.assignRole('part_category.add')
+
+        self.post(url, expected_code=200)
+
+        session = self.get(
+            reverse('api-import-session-detail', kwargs={'pk': session_id})
+        ).data
+
+        self.assertEqual(session['row_count'], 5)
+        self.assertEqual(session['completed_row_count'], 0)
+
+        # Fetch each row, and validate it
+        rows = self.get(
+            reverse('api-importer-row-list'), data={'session': session_id}
+        ).data
+
+        self.assertEqual(len(rows), 5)
+
+        row_ids = []
+
+        for row in rows:
+            row_ids.append(row['pk'])
+            self.assertEqual(row['session'], session_id)
+            self.assertTrue(row['valid'])
+            self.assertFalse(row['complete'])
+
+        # Validate the rows
+        url = reverse('api-import-session-accept-rows', kwargs={'pk': session_id})
+
+        self.post(
+            url,
+            {
+                'rows': row_ids[1:]  # Validate all but the first row
+            },
+        )
+
+        # Update session information
+        session = self.get(
+            reverse('api-import-session-detail', kwargs={'pk': session_id})
+        ).data
+
+        self.assertEqual(session['row_count'], 5)
+        self.assertEqual(session['completed_row_count'], 4)
+
+        for idx, row in enumerate(row_ids):
+            detail = self.get(
+                reverse('api-importer-row-detail', kwargs={'pk': row})
+            ).data
+
+            self.assertEqual(detail['session'], session_id)
+            self.assertEqual(detail['complete'], idx > 0)
+
+        # Check that there are new database records
+        self.assertEqual(PartCategory.objects.count(), N + 4)
+
+
 class AdminTest(ImporterMixin, AdminTestCase):
     """Tests for the admin interface integration."""
 
     def test_admin(self):
         """Test the admin URL."""
+        data_file = self.helper_file('companies.csv')
+
         session = self.helper(
             model=DataImportSession,
-            model_kwargs={'data_file': self.helper_content(), 'model_type': 'company'},
+            model_kwargs={'data_file': data_file, 'model_type': 'company'},
         )
         self.helper(model=DataImportRow, model_kwargs={'session_id': session.id})
@@ -31,7 +31,6 @@ const BASE_URL: string = IS_CI
   : 'http://localhost:5173';
 
 console.log('Running Playwright Tests:');
-console.log(`- CI Mode: ${IS_CI}`);
 console.log('- Base URL:', BASE_URL);
 
 export default defineConfig({
@@ -153,6 +153,10 @@ export default function ImporterDataSelector({
         };
       }
 
+      if (field == 'id') {
+        continue; // Skip the ID field
+      }
+
       fields[field] = {
         ...fieldDef,
         field_type: fieldDef.type,
@@ -225,6 +229,10 @@ export default function ImporterDataSelector({
 
   const editCell = useCallback(
     (row: any, col: any) => {
+      if (col.field == 'id') {
+        return; // Cannot edit the ID field
+      }
+
       setSelectedRow(row);
       setSelectedFieldNames([col.field]);
       editRow.open();
@@ -61,6 +61,7 @@ function ImporterColumn({
 
   return (
     <Select
+      aria-label={`import-column-map-${column.field}`}
       error={errorMessage}
       clearable
       searchable
@@ -1,9 +1,23 @@
+import type { ModelType } from '@lib/enums/ModelType';
 import type { ApiFormFieldSet } from '@lib/types/Forms';
 
-export function dataImporterSessionFields(): ApiFormFieldSet {
+export function dataImporterSessionFields({
+  modelType,
+  allowUpdate = false
+}: {
+  modelType?: ModelType | string;
+  allowUpdate?: boolean;
+}): ApiFormFieldSet {
   return {
     data_file: {},
-    model_type: {},
+    model_type: {
+      value: modelType,
+      hidden: modelType != undefined
+    },
+    update_records: {
+      hidden: allowUpdate !== true,
+      value: allowUpdate ? undefined : false
+    },
     field_defaults: {
       hidden: true,
       value: {}
@@ -340,7 +340,28 @@ export function InvenTreeTable<T extends Record<string, any>>({
   // Reset the pagination state when the search term changes
   useEffect(() => {
     tableState.setPage(1);
-  }, [tableState.searchTerm]);
+  }, [
+    tableState.searchTerm,
+    tableState.filterSet.activeFilters,
+    tableState.queryFilters
+  ]);
+
+  // Account for invalid page offsets
+  useEffect(() => {
+    if (
+      tableState.page > 1 &&
+      pageSize * tableState.page > tableState.recordCount
+    ) {
+      tableState.setPage(1);
+    } else if (tableState.page < 1) {
+      tableState.setPage(1);
+    }
+
+    if (pageSize < 10) {
+      // Default page size
+      setPageSize(25);
+    }
+  }, [tableState.records, tableState.page, pageSize]);
 
   // Data Sorting
   const [sortStatus, setSortStatus] = useState<DataTableSortStatus<T>>({
@@ -705,7 +726,7 @@ export function InvenTreeTable<T extends Record<string, any>>({
       ..._params,
       totalRecords: tableState.recordCount,
       recordsPerPage: tablePageSize,
-      page: tableState.page,
+      page: Math.max(1, tableState.page),
       onPageChange: tableState.setPage,
       recordsPerPageOptions: PAGE_SIZES,
       onRecordsPerPageChange: updatePageSize
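The "invalid page offset" guard added above reduces to a small pure function: keep the current page only while it points at data, otherwise fall back to the first page. A sketch of that invariant (names are illustrative, and the reset-to-page-1 behavior deliberately mirrors the effect hook, which resets whenever the page would extend past the record count rather than clamping to the last valid page):

```python
def clamp_page(page: int, page_size: int, record_count: int) -> int:
    """Return a valid page number, mirroring the table's guard above."""
    if page < 1:
        # Zero or negative pages are never valid
        return 1
    if page > 1 and page_size * page > record_count:
        # The requested page would extend past the available records
        return 1
    return page


print(clamp_page(5, 25, 12))  # 1: page 5 is past the 12 records
print(clamp_page(2, 10, 30))  # 2: still within range
```

The companion change further down (`page: Math.max(1, tableState.page)`) applies the same lower bound at render time, so the table never asks the pagination component for page 0.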
@@ -451,10 +451,9 @@ export function BomTable({
   const [selectedBomItem, setSelectedBomItem] = useState<any>({});
 
   const importSessionFields = useMemo(() => {
-    const fields = dataImporterSessionFields();
-    fields.model_type.hidden = true;
-    fields.model_type.value = 'bomitem';
+    const fields = dataImporterSessionFields({
+      modelType: 'bomitem'
+    });
 
     fields.field_overrides.value = {
       part: partId
@@ -79,10 +79,9 @@ export function PurchaseOrderLineItemTable({
   );
 
   const importSessionFields = useMemo(() => {
-    const fields = dataImporterSessionFields();
-    fields.model_type.hidden = true;
-    fields.model_type.value = ModelType.purchaseorderlineitem;
+    const fields = dataImporterSessionFields({
+      modelType: ModelType.purchaseorderlineitem
+    });
 
     // Specify override values for import
     fields.field_overrides.value = {
@@ -42,7 +42,9 @@ export default function ImportSessionTable() {
   const newImportSession = useCreateApiFormModal({
     url: ApiEndpoints.import_session_list,
     title: t`Create Import Session`,
-    fields: dataImporterSessionFields(),
+    fields: dataImporterSessionFields({
+      allowUpdate: true
+    }),
     onFormSuccess: (response: any) => {
       setSelectedSession(response.pk);
       setOpened(true);
src/frontend/tests/fixtures/bom_data.csv (vendored)
@@ -1,5 +1,4 @@
-Assembly,Component,Reference,Quantity,Overage,Allow Variants,Gets inherited,Optional,Consumable,Note,ID,Pricing min,Pricing max,Pricing min total,Pricing max total,Pricing updated,Component.Ipn,Component.Name,Component.Description,Validated,Available Stock,Available substitute stock,Available variant stock,External stock,On Order,In Production,Can Build
-87,66,Screws,4.0,,False,False,False,False,,16,0.28,0.648622,1.12,2.594488,2024-08-08 06:55,,M3x8 Torx,"Torx head screw, M3 thread, 8.0mm",True,485.0,0.0,0.0,0.0,0.0,0.0,121.25
-87,67,Large screw,1.0,,False,False,False,False,,17,0.574802,0.574802,0.574802,0.574802,2024-07-27 05:13,,M3x10 Torx,"Torx head screw, M3 thread, 10.0mm",True,1450.0,0.0,0.0,0.0,0.0,0.0,1450.0
-87,82,Enclosure,1.0,,False,False,False,False,,15,,,,,2024-07-27 05:08,,1551ABK,"Small plastic enclosure, black",True,165.0,223.0,0.0,0.0,0.0,0.0,388.0
-87,88,PCBA,1.0,,True,False,False,False,Assembled board,23,80.431083,129.328176,80.431083,129.328176,2024-12-27 23:14,002.01-PCBA,Widget Board (assembled),Assembled PCB for converting electricity into magic smoke,True,55.0,0.0,0.0,0.0,0.0,0.0,55.0
+Assembly,Component,Reference,Quantity,Allow Variants,Gets inherited,Optional,Consumable,Setup quantity,Attrition,Rounding multiple,Note,ID,Pricing min,Pricing max,Pricing min total,Pricing max total,Pricing updated,Component.Ipn,Component.Name,Component.Description,Validated,Available Stock,Available substitute stock,Available variant stock,External stock,On Order,In Production,Can Build
+106,98,screws,5,FALSE,TRUE,FALSE,TRUE,0,0,0,,39,0.075,0.1,0.375,0.5,23/07/2025 9:12,,Wood Screw,Screw for fixing wood to other wood,TRUE,1604,0,0,0,0,0,320.8
+106,95,legs,4,FALSE,TRUE,FALSE,FALSE,0,0,0,,40,10.6,12.75,42.4,51,23/07/2025 9:12,,Leg,Leg for a chair or a table,TRUE,317,0,0,0,0,0,79.25
+109,92,paint,0.125,FALSE,FALSE,FALSE,FALSE,0,0,0,,43,1.403886,14.389836,0.175486,1.79873,23/07/2025 9:12,,Green Paint,Green Paint,TRUE,110.125,0,0,0,0,0,881
@@ -92,8 +92,6 @@ test('Parts - BOM', async ({ browser }) => {
   await setTableChoiceFilter(page, 'active', 'Yes');
   await setTableChoiceFilter(page, 'BOM Valid', 'Yes');
 
-  await page.getByText('1 - 12 / 12').waitFor();
-
   // Navigate to BOM for a particular assembly
   await navigate(page, 'part/87/bom');
   await loadTab(page, 'Bill of Materials');
@@ -620,8 +618,11 @@ test('Parts - Bulk Edit', async ({ browser }) => {
   await page.getByLabel('Select record 2', { exact: true }).click();
   await page.getByLabel('action-menu-part-actions').click();
   await page.getByLabel('action-menu-part-actions-set-category').click();
+
   await page.getByLabel('related-field-category').fill('rnitu');
-  await page.getByRole('option', { name: '- Furniture/Chairs' }).click;
+  await page.waitForTimeout(250);
+
+  await page.getByRole('option', { name: '- Furniture/Chairs' }).click();
   await page.getByRole('button', { name: 'Update' }).click();
   await page.getByText('Items Updated').waitFor();
 });
@@ -14,6 +14,14 @@ test('Importing - Admin Center', async ({ browser }) => {
 
   const fileInput = await page.locator('input[type="file"]');
   await fileInput.setInputFiles('./tests/fixtures/bom_data.csv');
+
+  await page
+    .locator('label')
+    .filter({ hasText: 'Update Existing RecordsIf' })
+    .locator('div')
+    .first()
+    .click();
+
   await page.getByRole('button', { name: 'Submit' }).click();
 
   // Submitting without selecting model type, should show error
@@ -22,21 +30,54 @@ test('Importing - Admin Center', async ({ browser }) => {
 
   await page
     .getByRole('textbox', { name: 'choice-field-model_type' })
-    .fill('Cat');
-  await page
-    .getByRole('option', { name: 'Part Category', exact: true })
-    .click();
+    .fill('bom');
+  await page.getByRole('option', { name: 'BOM Item', exact: true }).click();
   await page.getByRole('button', { name: 'Submit' }).click();
 
-  await page.getByText('Description (optional)').waitFor();
-  await page.getByText('Parent Category').waitFor();
+  await page.getByText('Select the parent assembly').waitFor();
+  await page.getByText('Select the component part').waitFor();
+  await page.getByText('Existing database identifier for the record').waitFor();
+
+  await page
+    .getByRole('textbox', { name: 'import-column-map-reference' })
+    .click();
+  await page.getByRole('option', { name: 'Ignore this field' }).click();
+
+  await page.getByRole('button', { name: 'Accept Column Mapping' }).click();
+
+  // Check for expected ID values
+  for (const itemId of ['16', '17', '15', '23']) {
+    await page.getByRole('cell', { name: itemId, exact: true });
+  }
+
+  // Import all the records
+  await page
+    .getByRole('row', { name: 'Select all records Row Not' })
+    .getByLabel('Select all records')
+    .click();
+  await page
+    .getByRole('button', { name: 'action-button-import-selected' })
+    .click();
+
+  await page.getByText('Data has been imported successfully').waitFor();
+  await page.getByRole('button', { name: 'Close' }).click();
+
+  // Confirmation of full import success
+  await page.getByRole('cell', { name: '3 / 3' }).first().waitFor();
+
+  // Manually delete records
+  await page.getByRole('checkbox', { name: 'Select all records' }).click();
+  await page
+    .getByRole('button', { name: 'action-button-delete-selected' })
+    .click();
+  await page.getByRole('button', { name: 'Delete', exact: true }).click();
 });
 
 test('Importing - BOM', async ({ browser }) => {
   const page = await doCachedLogin(browser, {
     username: 'steven',
     password: 'wizardstaff',
-    url: 'part/87/bom'
+    url: 'part/109/bom'
   });
 
   await page
@@ -53,10 +94,10 @@ test('Importing - BOM', async ({ browser }) => {
   await page.waitForTimeout(500);
 
   await page.getByText('Importing Data').waitFor();
-  await page.getByText('0 / 4').waitFor();
+  await page.getByText('0 / 3').waitFor();
 
-  await page.getByText('Torx head screw, M3 thread, 10.0mm').first().waitFor();
-  await page.getByText('Small plastic enclosure, black').first().waitFor();
+  await page.getByText('Screw for fixing wood').first().waitFor();
+  await page.getByText('Leg for a chair or a table').first().waitFor();
 
   // Select some rows
   await page
@@ -90,15 +131,16 @@ test('Importing - BOM', async ({ browser }) => {
   await page.getByRole('button', { name: 'Submit' }).click();
   await page.waitForTimeout(250);
 
-  await page.getByText('0 / 2', { exact: true }).waitFor();
+  await page.getByText('0 / 1', { exact: true }).waitFor();
 
   // Submit a row
   await page
     .getByRole('row', { name: 'Select record 1 2 Thumbnail' })
    .getByLabel('row-action-menu-')
     .click();
 
   await page.getByRole('menuitem', { name: 'Accept' }).click();
-  await page.getByText('1 / 2', { exact: true }).waitFor();
+  await page.getByText('0 / 1', { exact: true }).waitFor();
 });
 
 test('Importing - Purchase Order', async ({ browser }) => {