Compare commits

...

233 Commits

Author SHA1 Message Date
Guillermo Bonet c79867188a Merge branch 'test' into dev
gitea/salix/pipeline/head There was a failure building this commit Details
2025-02-14 12:36:24 +01:00
Guillermo Bonet dce263992e fix: refs #8227 Roadmap columns
gitea/salix/pipeline/head This commit looks good Details
2025-02-14 12:36:06 +01:00
Carlos Andrés a874b96c5e revert a119d9f7fb
gitea/salix/pipeline/head There was a failure building this commit Details
revert feat: invoiceIn move deductible field from head to lines
2025-02-14 11:32:45 +00:00
Carlos Andrés a119d9f7fb feat: invoiceIn move deductible field from head to lines
gitea/salix/pipeline/head This commit looks good Details
2025-02-14 12:29:56 +01:00
Guillermo Bonet 309a287748 Merge branch 'test' into dev
gitea/salix/pipeline/head This commit looks good Details
2025-02-14 12:08:43 +01:00
Guillermo Bonet d9e177d6b4 Merge pull request 'feat: refs #8227 Update roadmap triggers to manage eta adjustments and prevent recursive calls' (!3428) from 8227-roadmapChanges into test
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3428
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-14 10:54:18 +00:00
Guillermo Bonet aca49505bd Merge branch 'test' into 8227-roadmapChanges
gitea/salix/pipeline/pr-test This commit looks good Details
2025-02-14 10:52:40 +00:00
Guillermo Bonet 28deadfbad fix: refs #8573 version
gitea/salix/pipeline/head This commit looks good Details
2025-02-14 10:37:08 +01:00
Guillermo Bonet d6f08d7e27 feat: refs #8227 update roadmap triggers and views, remove obsolete trigger and column
gitea/salix/pipeline/pr-test This commit looks good Details
2025-02-14 10:04:04 +01:00
Alex Moreno 8110e88aa7 fix: vnUser, default false
gitea/salix/pipeline/head There was a failure building this commit Details
2025-02-14 08:29:55 +01:00
Ivan Mas 9000becf07 Merge pull request 'refactor: refs #8573 add fk to expedition.hostFk' (!3447) from 8573-addFkExpeditionHost into dev
gitea/salix/pipeline/head There was a failure building this commit Details
Reviewed-on: #3447
Reviewed-by: Guillermo Bonet <guillermo@verdnatura.es>
2025-02-14 07:06:21 +00:00
Ivan Mas 5a37bd332e Merge branch 'dev' into 8573-addFkExpeditionHost
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-14 06:55:47 +00:00
Alex Moreno 44ed6254c3 Merge branch 'test' of https://gitea.verdnatura.es/verdnatura/salix into dev
gitea/salix/pipeline/head This commit looks good Details
2025-02-14 07:02:45 +01:00
Carlos Andrés 4e9bcf530b Merge branch 'master' of https://gitea.verdnatura.es/verdnatura/salix into test
gitea/salix/pipeline/pr-dev This commit looks good Details
gitea/salix/pipeline/head This commit looks good Details
2025-02-13 18:59:26 +01:00
Carlos Andrés b06ffdff52 Actualizar modules/ticket/back/methods/ticket/closeAll.js
gitea/salix/pipeline/head This commit looks good Details
2025-02-13 16:42:20 +00:00
Carlos Andrés 536203e0a1 Merge pull request 'fix: facturacion por consignatario en el cierre nocturno y unificación de backs' (!3453) from Hotfix-cierre-facturacion into master
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3453
Reviewed-by: Alex Moreno <alexm@verdnatura.es>
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-13 16:37:00 +00:00
Ivan Mas 13a76e5c70 Merge branch 'dev' into 8573-addFkExpeditionHost
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-13 15:43:56 +00:00
Ivan Mas caa921020d refactor: refs #8573 update before alter table
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-13 16:41:29 +01:00
Carlos Andrés 7f32a10c06 fix: facturacion por consignatario en el cierre nocturno y unificación de backs
gitea/salix/pipeline/pr-master This commit looks good Details
2025-02-13 15:22:37 +01:00
Guillermo Bonet 845ce8d6d8 feat: refs #8227 Minor changes
gitea/salix/pipeline/pr-test There was a failure building this commit Details
2025-02-13 14:36:09 +01:00
Guillermo Bonet 4042679c9b feat: refs #8227 Grants
gitea/salix/pipeline/pr-test There was a failure building this commit Details
2025-02-13 14:12:44 +01:00
Pako Natek 86091571cd Merge pull request 'fix(item_getBalance): refs #8408 availabled field prevails over landed' (!3449) from 8408-Disponible-por-zonas-y-horas into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3449
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-13 12:39:29 +00:00
Carlos Andrés b2071994da fix: update closeAll method to improve date handling and ticket selection logic
gitea/salix/pipeline/head This commit looks good Details
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-13 13:28:14 +01:00
Carlos Satorres cd258eb44e Merge pull request 'fix: refs #6553 remove business' (!3451) from 6553-warnFixBusinessSummary into test
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3451
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-13 11:14:15 +00:00
Jorge Penadés e99fbca255 Merge pull request 'feat: refs #8571 enhance email formatting in sendToSupport function with structured HTML table' (!3445) from 8571-hotfix-notShowToken into master
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3445
Reviewed-by: Alex Moreno <alexm@verdnatura.es>
2025-02-13 11:01:37 +00:00
Jorge Penadés fa12debcd7 Merge branch 'master' of https://gitea.verdnatura.es/verdnatura/salix into 8571-hotfix-notShowToken
gitea/salix/pipeline/pr-master This commit looks good Details
2025-02-13 11:59:36 +01:00
Carlos Satorres 1b7026fa00 fix: refs #6553 remove business
gitea/salix/pipeline/pr-test This commit looks good Details
2025-02-13 10:46:48 +01:00
Guillermo Bonet bd53f7367c Merge branch 'test' into 8227-roadmapChanges
gitea/salix/pipeline/pr-test This commit looks good Details
2025-02-13 08:34:20 +00:00
Guillermo Bonet 92ab3648e7 feat: refs #8227 Fix tests
gitea/salix/pipeline/pr-test This commit looks good Details
2025-02-13 09:22:45 +01:00
Pablo Natek 4b7c20075b Merge pull request 'feat: refs #6897 add search method and enhance ACL permissions for Entry model' (!3448) from 6897-addItemSearch into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3448
Reviewed-by: Alex Moreno <alexm@verdnatura.es>
2025-02-13 07:59:58 +00:00
Pako Natek 3a1849326b fix(item_getBalance): refs #8408 availabled field prevails over landed
gitea/salix/pipeline/pr-dev This commit looks good Details
Refs: #8408
2025-02-13 08:41:13 +01:00
Pako Natek d773aec0f5 Merge pull request 'feat(productionControl and collection_new): refs #8575 new itempackingtype a' (!3444) from 8575-itemPackingType-Altillo into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3444
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-13 06:45:37 +00:00
Pako Natek 317c152c66 Merge branch 'dev' into 8575-itemPackingType-Altillo
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-13 06:41:43 +00:00
Pablo Natek 514ddf1045 feat: refs #6897 add search method and enhance ACL permissions for Entry model
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-12 19:40:22 +01:00
Ivan Mas f12c47cdf8 Merge branch 'dev' into 8573-addFkExpeditionHost
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-12 16:11:53 +00:00
Ivan Mas 5404f895b2 refactor: refs #8573 add fk to expedition.hostFk
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-12 17:08:45 +01:00
Javier Segarra 0b8a54d057 Merge pull request 'Fix TicketNegative' (!3446) from fix_ticketNegative into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3446
Reviewed-by: Jon Elias <jon@verdnatura.es>
2025-02-12 15:43:59 +00:00
Jon Elias 307c8d92df Merge branch 'dev' into fix_ticketNegative
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-12 15:30:09 +00:00
Javier Segarra f631aa1314 fix: remotMethodCtx
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-12 16:26:01 +01:00
Guillermo Bonet 8959eb21f6 Merge branch 'test' into dev
gitea/salix/pipeline/head This commit looks good Details
2025-02-12 14:37:34 +01:00
Guillermo Bonet 5543da2c80 Merge branch 'master' into test
gitea/salix/pipeline/head This commit looks good Details
2025-02-12 14:37:10 +01:00
Guillermo Bonet ef9ca5f56c Merge pull request 'refactor: refs #6944 Update ticket_setState to improve state change logic and user tracking' (!3439) from 6944-stateNoRepeat into master
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3439
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-12 13:36:14 +00:00
Carlos Andrés 65a4967d46 fix: update maxShipped to use toDate in closeAll method
gitea/salix/pipeline/head This commit looks good Details
2025-02-12 14:16:43 +01:00
Jorge Penadés 3754ede42d feat: refs #8571 enhance email formatting in sendToSupport function with structured HTML table
gitea/salix/pipeline/pr-master This commit looks good Details
2025-02-12 14:15:53 +01:00
Guillermo Bonet ba3909a984 refactor: refs #6944 Requested changes
gitea/salix/pipeline/pr-master This commit looks good Details
2025-02-12 14:13:39 +01:00
Pako Natek b95db2eff1 feat(productionControl and collection_new): refs #8575 new itempackingtype a
gitea/salix/pipeline/pr-dev This commit looks good Details
Refs: #8575
2025-02-12 13:45:07 +01:00
Carlos Andrés 9785ab5a7d feat: enhance ticket closure process with error handling and email notifications
gitea/salix/pipeline/head This commit looks good Details
2025-02-12 13:23:57 +01:00
Carlos Andrés b40981aa03 feat: enhance ticket closure process with error handling and email notifications
gitea/salix/pipeline/head This commit looks good Details
2025-02-12 13:18:18 +01:00
Carlos Andrés fad95f2cf9 Merge pull request 'hotFix_daily_addressInvoice_2' (!3441) from hotFix_daily_addressInvoice_2 into test
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3441
Reviewed-by: Alex Moreno <alexm@verdnatura.es>
2025-02-12 11:22:08 +00:00
Alex Moreno 5d155ef6bd Merge branch 'test' into hotFix_daily_addressInvoice_2
gitea/salix/pipeline/pr-test This commit looks good Details
2025-02-12 11:17:27 +00:00
Carlos Andrés 1c9417556b feat: enhance ticket closure process with error handling and email notifications
gitea/salix/pipeline/pr-test There was a failure building this commit Details
2025-02-12 12:13:23 +01:00
Guillermo Bonet e7c027a8b1 refactor: refs #6944 Update ticket_setState to improve state change logic and user tracking
gitea/salix/pipeline/pr-master This commit looks good Details
2025-02-12 10:06:48 +01:00
Javi Gallego 1c8ad94ab8 fix: update SQL fixture values and enhance getVideoList method with transaction handling
gitea/salix/pipeline/head This commit looks good Details
2025-02-12 09:51:56 +01:00
Pablo Natek 8d0fec4ffd Merge pull request 'feat: refs #6897 add EntryConfig model and enhance entry filtering with new parameters' (!3366) from 6897-refactorEntryBuyList into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3366
Reviewed-by: Alex Moreno <alexm@verdnatura.es>
2025-02-12 06:37:43 +00:00
Pablo Natek c0b1f3337c Merge branch 'dev' into 6897-refactorEntryBuyList
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-11 15:50:28 +00:00
Pablo Natek f347d9668f refactor: refs #6897 improve variable scope and query parameters in recalcEntryPrices.js
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-11 16:50:00 +01:00
Carlos Satorres 798d8a514e Merge pull request 'fix: hotfix delivery-note' (!3437) from hotfix-producerPdf into master
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3437
Reviewed-by: Alex Moreno <alexm@verdnatura.es>
2025-02-11 15:13:27 +00:00
Carlos Satorres b49df364bf Merge branch 'master' into hotfix-producerPdf
gitea/salix/pipeline/pr-master This commit looks good Details
2025-02-11 14:58:16 +00:00
Carlos Satorres 92232b34ce fix: hotfix delivery-note
gitea/salix/pipeline/pr-master This commit looks good Details
2025-02-11 15:55:08 +01:00
Alex Moreno 3fbd740a2b fix: address invoice daily
gitea/salix/pipeline/pr-master There was a failure building this commit Details
2025-02-11 14:31:26 +01:00
Guillermo Bonet f682e3cfe6 Merge branch 'test' into dev
gitea/salix/pipeline/head This commit looks good Details
2025-02-11 13:26:18 +01:00
Guillermo Bonet 0a33e05bff Merge branch 'master' into test
gitea/salix/pipeline/head This commit looks good Details
2025-02-11 13:26:02 +01:00
Guillermo Bonet a387e3ae92 feat: refs #7162 Add packages and packagesList to ticket_doCmr procedure
gitea/salix/pipeline/head This commit looks good Details
2025-02-11 13:25:30 +01:00
Pablo Natek 96248132a1 refactor: refs #6897 sql fixture data for improved readability and consistency
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-11 13:02:50 +01:00
Pablo Natek 454fbcb7ce Merge branch 'dev' of https: refs #6897//gitea.verdnatura.es/verdnatura/salix into 6897-refactorEntryBuyList
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-11 12:38:42 +01:00
Javier Segarra 5df24d0e70 Merge pull request '#6321 - Negative tickets' (!1945) from 6321_negative_tickets into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #1945
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-11 08:45:32 +00:00
Javier Segarra 095e561c82 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-11 08:43:21 +00:00
Pako Natek 0bd345b6de Merge pull request '8408-Disponible-por-zonas-y-horas' (!3432) from 8408-Disponible-por-zonas-y-horas into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3432
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-11 07:50:15 +00:00
Pako Natek b4fe620f2e Merge branch 'dev' into 8408-Disponible-por-zonas-y-horas
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-11 07:47:16 +00:00
Pako Natek 5a8f7b2c1a Merge branch '8408-Disponible-por-zonas-y-horas' of https://gitea.verdnatura.es/verdnatura/salix into 8408-Disponible-por-zonas-y-horas
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-11 08:46:26 +01:00
Pako Natek fbf56ff0cf fix(available_refresh): refs #8408 more availabled cases
refs#8408
2025-02-11 08:46:24 +01:00
Pablo Natek 5ab45831e7 Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6897-refactorEntryBuyList 2025-02-11 08:19:31 +01:00
Pako Natek cdb91c06c2 Merge pull request '8408-Disponible-por-zonas-y-horas' (!3431) from 8408-Disponible-por-zonas-y-horas into dev
gitea/salix/pipeline/head This commit looks good Details
Reviewed-on: #3431
Reviewed-by: Javi Gallego <jgallego@verdnatura.es>
2025-02-11 07:16:38 +00:00
Pako Natek 1a92a00cce Merge branch 'dev' into 8408-Disponible-por-zonas-y-horas
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-11 07:12:25 +00:00
Pako Natek 5d674139fa Merge branch '8408-Disponible-por-zonas-y-horas' of https://gitea.verdnatura.es/verdnatura/salix into 8408-Disponible-por-zonas-y-horas
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-10 14:00:36 +01:00
Pako Natek ec5ef3d7f8 fix(item_getStock): refs #8408 field availabled used for itementryin selection
Refs: #8408
2025-02-10 14:00:34 +01:00
Guillermo Bonet 75b6867be8 feat: refs #8227 Minor change
gitea/salix/pipeline/pr-test There was a failure building this commit Details
2025-02-10 13:06:05 +01:00
Guillermo Bonet e19e50de14 feat: refs #8227 Update roadmap triggers to manage eta adjustments and prevent recursive calls
gitea/salix/pipeline/pr-test There was a failure building this commit Details
2025-02-10 13:04:23 +01:00
Pablo Natek 6110295cc2 fix: refs #6897 update entry_clone method to return newEntryId instead of result
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-10 11:40:54 +01:00
Pablo Natek 2f0cd27ed8 Merge branch 'dev' of https: refs #6897//gitea.verdnatura.es/verdnatura/salix into 6897-refactorEntryBuyList
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-09 18:26:52 +01:00
Pablo Natek 6ea4e3096e feat: refs #6897 add maxLockTime parameter to entryConfig insert statement
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-09 18:24:39 +01:00
Javier Segarra e748c3ea68 feat: refs #6321 minor changes
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-07 22:40:00 +01:00
Javier Segarra fce6b13d2d Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-07 16:19:58 +01:00
Javier Segarra 9bb273807d feat: refs #6321 i18n negativeReplaced 2025-02-07 16:18:26 +01:00
Javier Segarra f4dbddbe15 fix: refs #6321 dates in fixtures.before
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-07 09:54:55 +01:00
Javier Segarra 9322360979 fix: refs #6321 dates in fixtures.before
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-07 08:41:49 +01:00
Javier Segarra e035a73e06 feat: refs #6321 i18n es negativeReplaced
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-07 07:57:10 +01:00
Javier Segarra d185530839 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-07 07:56:05 +01:00
Javier Segarra 9390c0efed test: refs #6321 getSimilar.spec.js 2025-02-07 07:53:00 +01:00
Javier Segarra c4e64db9b9 Merge branch '6321_negative_tickets' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-06 23:23:00 +01:00
Javier Segarra ba58746a03 fix: refs #6321 test 2025-02-06 23:22:58 +01:00
Javier Segarra 8398a30e4f Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-06 14:58:55 +00:00
Javier Segarra 4e4d6c3b6a fix: refs #6321 fixtures
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-06 15:20:44 +01:00
Javier Segarra 9626b6c0ff feat: refs #6321 update itemLackDetail 2025-02-06 15:19:39 +01:00
Javier Segarra 338e833c0b feat: refs #6321 i18n 2025-02-06 15:19:23 +01:00
Javier Segarra 8170eafa36 feat: refs #6321 remove ticketConfig var 2025-02-06 15:19:07 +01:00
Javier Segarra 4c786be3af Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-06 12:34:41 +01:00
Javier Segarra 4c7b8212da feat: refs #6321 changes 2025-02-06 12:27:59 +01:00
Javier Segarra 3dd64e4257 feat: refs #6321 sql lackDetail step3 2025-02-06 10:31:43 +01:00
Javier Segarra e736c95fb6 feat: refs #6321 sql lackDetail step2 2025-02-06 10:29:48 +01:00
Javier Segarra 1af01ad747 feat: refs #6321 sql lackDetail step1 2025-02-06 10:26:38 +01:00
Javier Segarra 24411f9af1 fix: refs #6321 revert change
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-06 08:17:03 +01:00
Javier Segarra b2cbded2dc feat: refs #6321 defaultAlertLevelCode
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-06 01:06:02 +01:00
Javier Segarra 1f6f7b9975 feat: refs #6321 updates requested
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-06 00:34:08 +01:00
Javier Segarra da90d43f7a Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-05 15:36:44 +01:00
Javier Segarra e02dcf23b7 feat: refs #6321 add columns ticketConfigs 2025-02-05 15:36:01 +01:00
Pablo Natek 26faaad5b4 Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6897-refactorEntryBuyList
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-05 08:03:48 +01:00
Pablo Natek e4cd30bc27 feat: refs #6897 add groupingMode and hasMinPrice parameters to getBuyList method
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-05 07:34:47 +01:00
Javier Segarra 55eb882754 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-04 23:42:47 +01:00
Javier Segarra 272c7c0289 perf: refs #6321 minor changes
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-04 23:42:16 +01:00
Javier Segarra 5d209314f6 feat: refs #6321 use Date.vnNew
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-04 14:09:48 +01:00
Javier Segarra 0111373471 Merge branch 'dev' of https: refs #6321//gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-04 14:00:00 +01:00
Javier Segarra 550b0871f0 feat: refs #6321 changes
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-04 13:45:59 +01:00
Javier Segarra 410f3e73dc Merge branch 'dev' of https: refs #6321//gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-04 09:54:41 +01:00
Pablo Natek 13d9cac340 test: refs #6897 update expected results in item and tag filter tests
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-02-03 14:19:39 +01:00
Pablo Natek 8f3bf46165 Merge branch 'dev' of https: refs #6897//gitea.verdnatura.es/verdnatura/salix into 6897-refactorEntryBuyList
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-03 13:36:20 +01:00
Pablo Natek e7dd1f6a58 feat: refs #6897 add recalcEntryPrices method and enhance ACL permissions for entry operations
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-02-03 13:16:48 +01:00
Javier Segarra 49c6df42a7 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-01-31 10:20:42 +00:00
Javier Segarra 767c891317 perf: refs #6321 remove comments
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-01-31 01:17:23 +01:00
Javier Segarra 2574e59c71 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-01-31 01:07:53 +01:00
Javier Segarra cd7add3497 feat: refs #6321 debug 2025-01-31 01:07:39 +01:00
Javier Segarra 7fdd3d1eb8 feat: refs #6321 fix methods 2025-01-31 01:07:28 +01:00
Javier Segarra 9791f3b935 fix: refs #6321 fixtures 2025-01-31 01:06:57 +01:00
Javier Segarra b8894ca67d feat: refs #6321 i18n replaceItem 2025-01-31 01:04:34 +01:00
Javier Segarra 75b4202a7b feat: refs #6321 remove origin 2025-01-31 01:04:14 +01:00
Javier Segarra dc6f93c241 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev This commit looks good Details
2025-01-30 08:54:04 +01:00
Javier Segarra 811feb9fee feat: refs #6321 tour 2025-01-29 23:46:19 +01:00
Javier Segarra ac053814e6 test: refs #6321 fixing test 2025-01-29 12:26:42 +01:00
Javier Segarra a93e8b28db fix: refs #6321 getSimilar
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-01-28 23:37:56 +01:00
Javier Segarra 10eef6d1b6 feat: refs #6321 updates
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-01-28 08:45:00 +01:00
Javier Segarra 1a0992da78 feat: refs #6321 changes 2025-01-27 12:04:18 +01:00
Pablo Natek ef5c2ab3a2 feat: refs #6897 add cloneEntry and deleteEntry methods with corresponding ACL permissions
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-01-27 08:11:41 +01:00
Javier Segarra bd54eacda1 feat: refs #6321 alternative alertLevel 2025-01-26 02:36:39 +01:00
Javier Segarra 36192c14ec Merge branch 'dev' into 6321_negative_tickets 2025-01-25 09:10:45 +01:00
Javier Segarra c3361fd49b Merge branch 'dev' into 6321_negative_tickets 2025-01-23 14:48:28 +01:00
Javier Segarra 2a4bad5034 Merge branch 'dev' into 6321_negative_tickets 2025-01-21 23:43:07 +01:00
Javier Segarra 1560c48af2 feat: refs #6321 improve query 2025-01-20 14:32:26 +01:00
Pablo Natek f8a156b7ab feat: refs #6897 add EntryConfig model and enhance entry filtering with new parameters
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2025-01-17 08:11:18 +01:00
Javier Segarra 84dfdcb79a Merge branch 'dev' into 6321_negative_tickets 2025-01-14 12:43:25 +01:00
Javier Segarra 53298bd9ca Merge branch 'dev' into 6321_negative_tickets 2024-12-09 14:24:34 +01:00
Javier Segarra 62dd5cb675 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-09-24 22:12:21 +02:00
Javier Segarra c6490f6740 feat(salix): refs #6321 #6321 fixtures.before 2024-09-24 22:12:11 +02:00
Javier Segarra c876022fe5 feat(salix): refs #6321 #6321 TODO 2024-09-24 13:54:32 +02:00
Javier Segarra 2cb57225ff perf(salix): refs #6321 #7677 itemLackDetail 2024-09-21 00:26:55 +02:00
Javier Segarra 5e38d18fed Merge branch 'dev' into 6321_negative_tickets 2024-09-19 09:34:55 +02:00
Javier Segarra a93dd79fe2 Merge branch 'dev' into 6321_negative_tickets 2024-09-19 00:00:42 +02:00
Javier Segarra 36297009e1 perf(salix): refs #6321 #7677 itemLackDetail 2024-09-17 16:43:09 +02:00
Javier Segarra 03fcabd7f6 Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-09-16 09:58:20 +02:00
Javier Segarra e76e2a15f2 feat(salix): refs #6321 #6321 TODO 2024-09-13 09:44:09 +02:00
Javier Segarra 09a7918ab3 Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets 2024-09-13 08:51:28 +02:00
Javier Segarra fb851c3bdd feat: refs #6321 implement VnTable 2024-09-12 13:33:22 +02:00
Javier Segarra 94f99ccee1 fix(salix): refs #6321 #6321 remove ticketMethod clone 2024-09-11 11:58:58 +02:00
Javier Segarra 844e96583b Merge branch 'dev' of https: refs #6321//gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets 2024-09-11 08:45:42 +02:00
Javier Segarra f77163102c Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-07-23 10:43:12 +02:00
Javier Segarra b2d58a1d6f Merge branch 'dev' into 6321_negative_tickets 2024-07-22 17:30:25 +02:00
Javier Segarra 0e97c453ed Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-07-22 16:47:05 +02:00
Javier Segarra 7ec47f2f80 Merge branch 'dev' into 6321_negative_tickets 2024-07-22 10:21:50 +02:00
Javier Segarra c9c9d5973d test(salix): refs #6321 #6321 fix test
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-07-20 00:36:59 +02:00
Javier Segarra 7e8c2eebe5 feat: refs #6321 restore fixtures.before.sql 2024-07-19 19:38:21 +02:00
Javier Segarra 212f84aa9b revert commit 2024-07-19 11:41:33 +02:00
Javier Segarra 48b8bda49a Merge branch 'dev' into 6321_negative_tickets 2024-07-19 09:42:42 +02:00
Javier Segarra ac7c28cd27 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-07-18 22:02:22 +02:00
Javier Segarra 99efdffe58 feat(salix): refs #6321 #6321 retrieve observationType 2024-07-04 09:39:06 +02:00
Javier Segarra 8b72b7211e feat(salix): refs #7380 #7380 new typeObservation
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-07-03 23:06:13 +02:00
Javier Segarra 14e14eea2a feat(salix): refs #7380 #7380 client.substitutionAllowed new field 2024-07-03 23:06:06 +02:00
Javier Segarra b5ea2f12ff Merge remote-tracking branch 'origin/dev' into 6321_negative_tickets 2024-07-03 23:05:42 +02:00
Javier Segarra 37de252e15 Merge branch 'dev' into 6321_negative_tickets 2024-07-02 12:22:35 +02:00
Javier Segarra a1c48974c9 Merge branch 'dev' into 6321_negative_tickets 2024-06-20 12:21:24 +02:00
Javier Segarra e87c8ee5a7 feat(Salix): refs #6321 #6427 change url endpoint 2024-06-18 13:17:29 +02:00
Javier Segarra dba76a4f6b test(Salix): refs #6321 #6321 add default items as Proposal 2024-06-17 12:38:36 +02:00
Javier Segarra e45ac6424c perf(salix): refs #6321 #6321 updates 2024-06-14 11:43:06 +02:00
Javier Segarra ab85b8e703 Merge branch 'dev' into 6321_negative_tickets 2024-06-12 22:37:14 +02:00
Javier Segarra 2cbd610bc2 perf(salix): refs #6321 #7563 add ink.showOrder to procedure 2024-06-12 22:19:59 +02:00
Javier Segarra 02bc3afcda Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-06-11 13:53:30 +02:00
Javier Segarra 64a4a78308 feat(salix): refs #6321 updates 2024-06-10 17:09:25 +02:00
Javier Segarra 149aeac54e Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-06-04 09:26:15 +02:00
Javier Segarra 7468f87808 feat(salix): refs #6321 #6321 improve split mehtod
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-30 07:46:10 +02:00
Javier Segarra aab7a7ec73 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-28 12:45:50 +02:00
Javier Segarra 4fe1d80e7c feat(salix): refs #6321 default value when days is not present 2024-05-24 14:00:41 +02:00
Javier Segarra 8366cfa348 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-24 11:12:52 +02:00
Javier Segarra 70f245fd2d Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-15 16:25:44 +02:00
Javier Segarra 3401f0d745 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-15 08:55:10 +02:00
Javier Segarra befc128950 feat(salix): refs #6321 Sale_itemReplace 2024-05-15 08:36:13 +02:00
Javier Segarra 5c0b25bb30 Merge branch '6321_negative_tickets' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-13 13:51:20 +02:00
Javier Segarra 888f15049a feat(salix): refs #6321 #6321 New arg 2024-05-13 13:51:04 +02:00
Jorge Penadés cfea648103 Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-13 13:25:52 +02:00
Javier Segarra e30c66313f Merge branch 'dev' into 6321_negative_tickets 2024-05-08 12:29:06 +02:00
Javier Segarra de7469419a feat(salix): refs #6321 #6321 getSimilar minor update 2024-05-03 07:23:12 +02:00
Javier Segarra 7caea44427 feat(salix): refs #6321 #6321 getSimilar
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-02 13:53:28 +02:00
Javier Segarra 63d07cb082 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-05-02 12:47:59 +02:00
Javier Segarra cb76075bf8 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-29 11:55:39 +02:00
Javier Segarra d638e31a1a Merge branch 'dev' into 6321_negative_tickets 2024-04-23 19:15:45 +02:00
Javier Segarra caaa4fdd30 Merge remote-tracking branch 'origin/dev' into 6321_negative_tickets 2024-04-23 11:45:59 +02:00
Javier Segarra 68158f341d feat(salix): refs #6321 #6331 publish negativeOrigin model 2024-04-22 14:09:27 +02:00
Javier Segarra 5a5032f6e6 Merge remote-tracking branch 'origin/dev' into 6321_negative_tickets 2024-04-22 13:34:06 +02:00
Javier Segarra 41f0b6aa93 Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-16 05:13:35 +00:00
Javier Segarra e0712645a2 refs #6321 test: fix
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-08 12:17:32 +02:00
Javier Segarra 3dd162b683 refs #6321 test: fix
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-08 12:16:07 +02:00
Javier Segarra 25fc39ef2b refs #6321 perf: change descriptions
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-08 11:33:32 +02:00
Javier Segarra 134c468589 Merge branch 'dev' into 6321_negative_tickets 2024-04-08 11:30:48 +02:00
Javier Segarra c4f8734d44 refs #6321 fix: param
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-03 09:48:56 +02:00
Javier Segarra cc3f2da639 refs #6321 perf: minor change
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-02 15:02:33 +02:00
Javier Segarra 586f37afd2 refs #6321 perf: add arguments into procedure 2024-04-02 13:28:26 +02:00
Juanjo Breso 9a80f8c2ce minor change 2024-04-02 10:45:29 +02:00
Javier Segarra d08535ac18 refs #6321 minor changes
gitea/salix/pipeline/pr-dev This commit looks good Details
2024-04-02 08:04:27 +02:00
Javier Segarra 5d24844256 refs #6321 test: debug use TIMEOUT
gitea/salix/pipeline/pr-dev This commit looks good Details
2024-04-01 16:12:52 +02:00
Javier Segarra 601f5db080 refs #6321 test: spliy 2024-04-01 16:11:30 +02:00
Javier Segarra 59498179ec refs #6321 test: itemLackDetail 2024-04-01 14:05:45 +02:00
Javier Segarra d225821a41 refs #6321 test: itemLack
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-04-01 13:59:35 +02:00
Javier Segarra d62c55dc9f refs #6321 test: negativeOrigin 2024-04-01 13:11:58 +02:00
Javier Segarra 2cff160c6a Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-28 23:50:51 +00:00
Javier Segarra 65a6174e2b refs #6321 updates
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-28 12:01:06 +01:00
Javier Segarra e6fe245b27 refs #6321 feat: new split method
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-27 14:09:25 +01:00
Javier Segarra d8d0ced918 Merge branch 'dev' into 6321_negative_tickets 2024-03-27 09:54:02 +01:00
Javier Segarra a943e39ba7 refs #6321 feat: negativeOrigin 2024-03-22 22:44:37 +01:00
Javier Segarra e085bc7f1e Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-21 07:47:39 +01:00
Javier Segarra 44c4e6a16e Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-15 09:47:10 +01:00
Javier Segarra 6c0706cc56 refs #6321 perf: query to retrieve results 2024-03-15 09:33:06 +01:00
Javier Segarra f83f7808c8 refs #6321 feat: negativeOrigin method 2024-03-15 09:32:36 +01:00
Javier Segarra 6a12af2eb9 refs #6321 feat: add producerFk
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-14 15:26:30 +01:00
Javier Segarra ed6b25455b refs #5858 feat: improve itemLackDetail 2024-03-13 14:27:45 +01:00
Javier Segarra c8446eb9a1 refs #6321 perf: updatemethod
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-13 09:20:32 +01:00
Javier Segarra d1e7e13333 refs #6321 feat: acl
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-06 14:37:28 +01:00
Javier Segarra 7c8fa52da0 Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets 2024-03-06 14:30:02 +01:00
Javier Segarra 871447cc6e refs #6321 feat: updates
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-03-05 08:07:54 +01:00
Javier Segarra bc09ad7da7 Merge branch 'dev' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets 2024-01-29 11:07:31 +01:00
Javier Segarra 1a21dda00b refs #6321 feat itemLackDetail
gitea/salix/pipeline/head There was a failure building this commit Details
gitea/salix/pipeline/pr-dev There was a failure building this commit Details
2024-01-29 09:55:44 +01:00
Javier Segarra 895d9bff64 refs #6321 feat itemLAck with SQL 2024-01-29 09:55:34 +01:00
Javier Segarra f49444c19f Merge branch '6321_negative_tickets' of https://gitea.verdnatura.es/verdnatura/salix into 6321_negative_tickets
gitea/salix/pipeline/head There was a failure building this commit Details
2024-01-23 09:42:59 +01:00
Javier Segarra 48d9a3934a Merge branch 'dev' into 6321_negative_tickets
gitea/salix/pipeline/head This commit looks good Details
2024-01-22 19:29:47 +00:00
Javier Segarra 2bcb6366b2 refs #6321 feat: vCustomWhere 2024-01-22 10:10:38 +01:00
Javier Segarra 0111aa1b75 refs #6321 feat: fixtures and update procedure 2024-01-22 09:56:10 +01:00
Javier Segarra 91f5ee3b93 refs #6321 feat: new remoteMethod
gitea/salix/pipeline/head This commit looks good Details
2024-01-20 12:29:41 +01:00
74 changed files with 2214 additions and 602 deletions

View File

@@ -33,25 +33,52 @@ module.exports = Self => {
         const emailUser =
             await Self.app.models.EmailUser.findById(userId, {fields: ['email']});
-        let html = `<h2>Motivo: ${reason}</h2>`;
-        html += `<h3>Usuario: ${userId} ${emailUser.email}</h3>`;
-        html += `<h3>Additional Data:</h3>`;
-        html += '<ul>';
+        const tableStyle = 'width:100%; border-collapse: collapse; text-align: left;';
+        const thStyle = 'padding: 8px; border: 1px solid #ddd; background-color: #f4f4f4;';
+        const tdStyle = 'padding: 8px; border: 1px solid #ddd;';
+        const tdBoldStyle = 'padding: 8px; border: 1px solid #ddd; font-weight: bold;';
+        const subTdStyle = 'padding: 6px; border: 1px solid #ddd;';
+        const subTdBoldStyle = 'padding: 6px; border: 1px solid #ddd; font-weight: bold;';
+        let html = `
+            <h2>Motivo: ${reason}</h2>
+            <h3>Usuario: ${userId} ${emailUser.email}</h3>
+            <h3>Additional Data:</h3>
+            <table style="${tableStyle}">
+                <thead>
+                    <tr>
+                        <th style="${thStyle}">Clave</th><th style="${thStyle}">Valor</th></tr>
+                </thead>
+                <tbody>`;
         for (const [key, val] of Object.entries(additionalData)) {
-            if (key !== 'config') html += `<li>${key}: ${parse(val)}</li>`;
-            else {
-                html += `<li>${key}:</li><ul style="list-style-type: square;">`;
-                for (const [confKey, confVal] of Object.entries(val))
-                    html += `<li>${confKey}: ${parse(confVal)}</li>`;
-                html += '</ul>';
+            if (key !== 'config') {
+                html += `<tr>
+                    <td style="${tdBoldStyle}">${key}</td>
+                    <td style="${tdStyle}">${parse(val)}</td>
+                </tr>`;
+            } else {
+                html += `<tr>
+                    <td style="${tdBoldStyle}">${key}</td>
+                    <td style="${tdStyle}">
+                        <table style="${tableStyle}">
+                            <tbody>`;
+                for (const [confKey, confVal] of Object.entries(val)) {
+                    html += `<tr>
+                        <td style="${subTdBoldStyle}">${confKey}</td>
+                        <td style="${subTdStyle}">${parse(confVal)}</td>
+                    </tr>`;
+                }
+                html += `</tbody></table></td></tr>`;
             }
         }
-        html += '</ul>';
+        html += `</tbody></table>`;
 
         const {message, path, name} = additionalData;
+        const err = name && message ? `${name}: ${message}` : name || message || '';
 
         await smtp.send({
             to: `${config.app.reportEmail}, ${emailUser.email}`,
-            subject: `[Support-Salix] ${path} ${name}: ${message}`,
+            subject: `[Support-Salix] ${path.split('?')[0]} ${err}`,
             html
         });
     };
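The hunk above changes both the email body (a plain list becomes a styled table) and the subject line. Below is a minimal standalone sketch of the new subject behaviour, with invented sample values for additionalData (path, name and message are the fields destructured in the diff): the query string, which may carry an access token, is dropped from path, and the name/message pair falls back gracefully when one of them is missing.

// Illustration only; the sample values below are hypothetical.
const additionalData = {
    path: '/api/Tickets/filter?access_token=abc123',
    name: 'Error',
    message: 'Invalid filter'
};
const {message, path, name} = additionalData;

// Old subject: used the full path, query string (and any token in it) included.
const oldSubject = `[Support-Salix] ${path} ${name}: ${message}`;

// New subject: query string stripped, graceful fallback when name or message is missing.
const err = name && message ? `${name}: ${message}` : name || message || '';
const newSubject = `[Support-Salix] ${path.split('?')[0]} ${err}`;

console.log(oldSubject); // [Support-Salix] /api/Tickets/filter?access_token=abc123 Error: Invalid filter
console.log(newSubject); // [Support-Salix] /api/Tickets/filter Error: Invalid filter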

View File

@@ -54,7 +54,8 @@
             "type": "string"
         },
         "hasGrant": {
-            "type": "boolean"
+            "type": "boolean",
+            "default": false
         },
         "passExpired": {
             "type": "date"
@@ -168,6 +169,7 @@
                 "emailVerified",
                 "twoFactor"
             ]
         }
     }
 }
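A rough sketch of what the added "default": false is meant to do for hasGrant (this is not LoopBack code itself; it only mimics the usual semantics of a property default in a model definition): instances created without an explicit value end up with false rather than undefined.

// Toy illustration; the property name comes from the diff, the resolution logic is assumed.
const hasGrantDef = {type: 'boolean', default: false};

function resolveProperty(data, name, def) {
    return name in data ? data[name] : def.default;
}

console.log(resolveProperty({}, 'hasGrant', hasGrantDef));               // false (was undefined before the change)
console.log(resolveProperty({hasGrant: true}, 'hasGrant', hasGrantDef)); // true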

View File

@@ -77,8 +77,8 @@ INSERT INTO `vn`.`agency` (`name`, `warehouseFk`, `isOwn`, `isAnyVolumeAllowed`)
 ('Otra agencia ', '1', '0', '0');
 
 INSERT INTO `vn`.`expedition` (`agencyModeFk`, `ticketFk`, `isBox`, `counter`, `workerFk`, `externalId`, `packagingFk`, `hostFk`, `itemPackingTypeFk`, `hasNewRoute`) VALUES
-('1', '1', 1, '1', '1', '1', '1', 'pc00', 'F', 0),
-('1', '1', 1, '2', '1', '1', '1', 'pc00', 'F', 0);
+('1', '1', 1, '1', '1', '1', '1', 'pc1', 'F', 0),
+('1', '1', 1, '2', '1', '1', '1', 'pc1', 'F', 0);
 
 INSERT INTO vn.client (id,name,defaultAddressFk,street,fi,email,dueDay,isTaxDataChecked,accountingAccount,city,provinceFk,postcode,socialName,contact,credit,countryFk,quality,riskCalculated) VALUES
 (100,'root',110,'Valle de la muerte','74974747G','root@mydomain.com',0,1,'4300000078','ALGEMESI',1,'46680','rootSocial','rootContact',500.0,1,10,'2025-01-01');

View File

@@ -761,44 +761,45 @@ INSERT INTO `vn`.`route`(`id`, `time`, `workerFk`, `created`, `vehicleFk`, `agen
 (7, NULL, 57, util.VN_CURDATE(), 6, 8, 'seventh route', 0, 70, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), util.VN_CURDATE());
 
 INSERT INTO `vn`.`ticket`(`id`, `priority`, `agencyModeFk`,`warehouseFk`,`routeFk`, `shipped`, `landed`, `clientFk`,`nickname`, `addressFk`, `refFk`, `isDeleted`, `zoneFk`, `zonePrice`, `zoneBonus`, `created`, `weight`, `cmrFk`, `problem`, `risk`)
 VALUES
 (1 , 3, 1, 1, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -1 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL -1 MONTH), INTERVAL +1 DAY), 1101, 'Bat cave', 121, NULL, 0, 1, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -1 MONTH), 1, 1, 'hasHighRisk', 901.4),
 (2 , 1, 1, 1, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -1 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL -1 MONTH), INTERVAL +1 DAY), 1101, 'Bat cave', 1, NULL, 0, 1, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -1 MONTH), 2, 2, 'hasHighRisk', 901.4),
 (3 , 1, 7, 1, 6, DATE_ADD(util.VN_CURDATE(), INTERVAL -2 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL -2 MONTH), INTERVAL +1 DAY), 1104, 'Stark tower', 124, NULL, 0, 3, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -2 MONTH), NULL, 3, NULL, NULL),
 (4 , 3, 2, 1, 2, DATE_ADD(util.VN_CURDATE(), INTERVAL -3 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL -3 MONTH), INTERVAL +1 DAY), 1104, 'Stark tower', 124, NULL, 0, 9, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -3 MONTH), NULL, NULL, NULL, NULL),
 (5 , 3, 3, 3, 3, DATE_ADD(util.VN_CURDATE(), INTERVAL -4 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL -4 MONTH), INTERVAL +1 DAY), 1104, 'Stark tower', 124, NULL, 0, 10, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -4 MONTH), NULL, NULL, NULL, NULL),
 (6 , 1, 3, 3, 3, DATE_ADD(util.VN_CURDATE(), INTERVAL -1 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL -1 MONTH), INTERVAL +1 DAY), 1101, 'Mountain Drive Gotham', 1, NULL, 0, 10, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL -1 MONTH), NULL, NULL, 'hasHighRisk', 901.4),
 (7 , NULL, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1101, 'Mountain Drive Gotham', 1, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, 'hasHighRisk', 901.4),
 (8 , NULL, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1101, 'Bat cave', 121, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, 'hasHighRisk', 901.4),
 (9 , NULL, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1104, 'Stark tower', 124, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (10, 1, 1, 5, 1, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1102, 'Ingram Street', 2, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, 'isTooLittle', NULL),
 (11, 1, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1102, 'NY roofs', 122, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, 'hasTicketRequest', NULL),
 (12, 1, 8, 1, 1, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1103, 'Phone Box', 123, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (13, 1, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1103, 'Phone Box', 123, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (14, 1, 2, 1, NULL, util.VN_CURDATE(), util.VN_CURDATE(), 1104, 'Malibu Point', 4, NULL, 0, 9, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (15, 1, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1105, 'An incredibly long alias for testing purposes', 125, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, 'isFreezed', NULL),
 (16, 1, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1106, 'Many Places', 126, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, 388.7),
 (17, 1, 7, 2, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1106, 'Many Places', 126, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, 388.7),
 (18, 1, 4, 4, 4, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1108, 'Cerebro', 128, NULL, 0, 12, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL +12 HOUR), NULL, NULL, 'isFreezed', NULL),
 (19, 1, 5, 5, NULL, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1109, 'Somewhere in Thailand', 129, NULL, 1, NULL, 5, 1, util.VN_CURDATE(), NULL, NULL, 'isTaxDataChecked', NULL),
 (20, 1, 5, 5, 3, DATE_ADD(util.VN_CURDATE(), INTERVAL +1 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL +1 MONTH), INTERVAL +1 DAY), 1109, 'Somewhere in Thailand', 129, NULL, 0, 13, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL +1 MONTH), NULL, NULL, 'isTaxDataChecked', NULL),
 (21, NULL, 5, 5, 5, DATE_ADD(util.VN_CURDATE(), INTERVAL +1 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL +1 MONTH), INTERVAL +1 DAY), 1109, 'Somewhere in Holland', 102, NULL, 0, 13, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL +1 MONTH), NULL, NULL, 'isTaxDataChecked', NULL),
 (22, NULL, 5, 5, 5, DATE_ADD(util.VN_CURDATE(), INTERVAL +1 MONTH), DATE_ADD(DATE_ADD(util.VN_CURDATE(),INTERVAL +1 MONTH), INTERVAL +1 DAY), 1109, 'Somewhere in Japan', 103, NULL, 0, 13, 5, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL +1 MONTH), NULL, NULL, 'isTaxDataChecked', NULL),
 (23, NULL, 8, 1, 7, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1101, 'address 21', 121, NULL, 0, 5, 5, 1, util.VN_CURDATE(), NULL, NULL, 'hasTicketRequest, hasHighRisk', 901.4),
 (24 ,NULL, 8, 1, 7, util.VN_CURDATE(), util.VN_CURDATE(), 1101, 'Bruce Wayne', 1, NULL, 0, 5, 5, 1, util.VN_CURDATE(), NULL, NULL, 'hasHighRisk', 901.4),
 (25 ,NULL, 8, 1, NULL, util.VN_CURDATE(), util.VN_CURDATE(), 1101, 'Bruce Wayne', 1, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, 'hasHighRisk', 901.4),
 (26 ,NULL, 8, 1, NULL, util.VN_CURDATE(), util.VN_CURDATE(), 1101, 'An incredibly long alias for testing purposes', 1, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, 'hasHighRisk', 901.4),
 (27 ,NULL, 8, 1, NULL, util.VN_CURDATE(), util.VN_CURDATE(), 1101, 'Wolverine', 1, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, 901.4),
 (28, 1, 8, 1, 1, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1103, 'Phone Box', 123, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (29, 1, 8, 1, 1, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1103, 'Phone Box', 123, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (30, 1, 8, 1, 1, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1103, 'Phone Box', 123, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (31, 1, 8, 1, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), DATE_ADD(util.VN_CURDATE(), INTERVAL + 2 DAY), 1103, 'Phone Box', 123, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (32, 1, 8, 1, 1, DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), DATE_ADD(util.VN_CURDATE(), INTERVAL + 2 DAY), 1103, 'Phone Box', 123, NULL, 0, 1, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (33, 1, 7, 1, 6, util.VN_CURDATE(), DATE_ADD(util.VN_CURDATE(), INTERVAL + 1 DAY), 1102, 'NY roofs', 122, NULL, 0, 3, 5, 1, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (34, 1, 1, 1, 3, util.VN_CURDATE(), util.VN_CURDATE(), 1103, 'BEJAR', 123, NULL, 0, 1, 16, 0, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (35, 1, 1, 1, 3, util.VN_CURDATE(), util.VN_CURDATE(), 1102, 'Somewhere in Philippines', 123, NULL, 0, 1, 16, 0, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
 (36, 1, 1, 1, 3, util.VN_CURDATE(), util.VN_CURDATE(), 1102, 'Ant-Man Adventure', 123, NULL, 0, 1, 16, 0, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
-(37, 1, 1, 1, 3, util.VN_CURDATE(), util.VN_CURDATE(), 1110, 'Deadpool swords', 123, NULL, 0, 1, 16, 0, util.VN_CURDATE(), NULL, NULL, NULL, NULL);
+(37, 1, 1, 1, 3, util.VN_CURDATE(), util.VN_CURDATE(), 1110, 'Deadpool swords', 123, NULL, 0, 1, 16, 0, util.VN_CURDATE(), NULL, NULL, NULL, NULL),
+(1000000, NULL, 1, 1, NULL, util.VN_CURDATE(), util.VN_CURDATE(), 1, 'employee', 131, NULL, 0, 1, 1.00, 0.00, CURDATE(), NULL, NULL, '', NULL);
 
 INSERT INTO `vn`.`ticketObservation`(`id`, `ticketFk`, `observationTypeFk`, `description`)
 VALUES
@@ -923,17 +924,18 @@ INSERT INTO `vn`.`itemType`(`id`, `code`, `name`, `categoryFk`, `life`, `workerF
 (5, 'CON', 'Container', 3, NULL, 35, 1, 'warm', 0),
 (6, 'ALS', 'Alstroemeria', 1, 31, 16, 0, 'warm', 1);
 
-INSERT INTO `vn`.`ink`(`id`, `name`, `picture`, `showOrder`, `hex`)
+INSERT INTO `vn`.`ink`(`id`, `name`, `picture`, `showOrder`, `hex`, `hexJson`)
 VALUES
-('YEL', 'Yellow', 1, 1, 'F4D03F'),
-('BLU', 'Blue', 1, 2, '5DADE2'),
-('RED', 'Red', 1, 3, 'EC7063'),
-('SLV', 'Silver', 1, 4, 'CACFD2'),
-('BRW', 'Brown', 1, 5, 'DC7633'),
-('BLK', 'Black', 1, 6, '000000'),
-('BAS', 'Blue/Silver', 1, 7, '5DADE2'),
-('GRN', 'Green', 1, 8, '28A745'),
-('WHT', 'White', 1, 9, 'FFFFFF');
+('YEL', 'Yellow', 1, 1, 'F4D03F', '{"value": ["F4D03F"]}'),
+('BLU', 'Blue', 1, 2, '5DADE2', '{"value": ["5DADE2"]}'),
+('RED', 'Red', 1, 3, 'EC7063', '{"value": ["EC7063"]}'),
+('SLV', 'Silver', 1, 4, 'CACFD2', '{"value": ["CACFD2"]}'),
+('BRW', 'Brown', 1, 5, 'DC7633', '{"value": ["DC7633"]}'),
+('BLK', 'Black', 1, 6, '000000', '{"value": ["000000"]}'),
+('BAS', 'Blue/Silver', 1, 7, '5DADE2', '{"value": ["5DADE2"]}'),
+('GRN', 'Green', 1, 8, '28A745', '{"value": ["28A745"]}'),
+('WHT', 'White', 1, 9, 'FFFFFF', '{"value": ["FFFFFF"]}'),
+('RGB', 'Red/Green/Blue', 1, 9, 'FFFFFF', '{"value": ["EC7063","5DADE2","28A745"]}');
 
 INSERT INTO `vn`.`origin`(`id`,`code`, `name`)
 VALUES
@@ -981,28 +983,30 @@ INSERT INTO `vn`.`itemFamily`(`code`, `description`)
 ('VT', 'Sales');
 
 INSERT INTO `vn`.`item`(
 `id`, `typeFk`, `stems`, `originFk`, `description`, `producerFk`, `intrastatFk`, `expenseFk`,
 `comment`, `relevancy`, `image`, `subName`, `minPrice`, `family`, `isFloramondo`, `genericFk`,
-`itemPackingTypeFk`, `hasMinPrice`, `weightByPiece`, `isCustomInspectionRequired`
+`itemPackingTypeFk`, `hasMinPrice`, `packingOut`, `weightByPiece`, `isCustomInspectionRequired`
 )
 VALUES
-(1, 2, 1, 1, NULL, 1, 06021010, 2000000000, NULL, 0, '1', NULL, 0, 'EMB', 0, NULL, 'V', 0, 3, 1),
-(2, 2, 1, 2, NULL, 1, 06021010, 2000000000, NULL, 0, '2', NULL, 0, 'VT', 0, NULL, 'H', 0, 2, 1),
-(3, 1, 1, 3, NULL, 1, 05080000, 4751000000, NULL, 0, '3', NULL, 0, 'VT', 0, NULL, NULL, 0, 5, 0),
-(4, 1, 1, 1, 'Increases block', 1, 05080000, 4751000000, NULL, 0, '4', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0),
-(5, 3, 1, 2, NULL, 2, 06021010, 4751000000, NULL, 0, '5', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0),
-(6, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '6', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0),
-(7, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '7', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0),
+(1, 2, 1, 1, NULL, 1, 06021010, 2000000000, NULL, 0, '1', NULL, 0, 'EMB', 0, NULL, 'V', 0, NULL, 3, 1),
+(2, 2, 1, 2, NULL, 1, 06021010, 2000000000, NULL, 0, '2', NULL, 0, 'VT', 0, NULL, 'H', 0, NULL, 2, 1),
+(3, 1, 1, 3, NULL, 1, 05080000, 4751000000, NULL, 0, '3', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 5, 0),
+(4, 1, 1, 1, 'Increases block', 1, 05080000, 4751000000, NULL, 0, '4', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
+(5, 3, 1, 2, NULL, 2, 06021010, 4751000000, NULL, 0, '5', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
+(6, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '6', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
+(7, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '7', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
(8, 2, 1, 1, NULL, 1, 06021010, 2000000000, NULL, 0, '8', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0), (8, 2, 1, 1, NULL, 1, 06021010, 2000000000, NULL, 0, '8', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
(9, 2, 1, 2, NULL, 1, 06021010, 2000000000, NULL, 0, '9', NULL, 0, 'VT', 1, NULL, NULL, 0, NULL, 0), (9, 2, 1, 2, NULL, 1, 06021010, 2000000000, NULL, 0, '9', NULL, 0, 'VT', 1, NULL, NULL, 0, NULL, NULL, 0),
(10, 1, 1, 3, NULL, 1, 05080000, 4751000000, NULL, 0, '10', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0), (10, 1, 1, 3, NULL, 1, 05080000, 4751000000, NULL, 0, '10', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
(11, 1, 1, 1, NULL, 1, 05080000, 4751000000, NULL, 0, '11', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0), (11, 1, 1, 1, NULL, 1, 05080000, 4751000000, NULL, 0, '11', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
(12, 3, 1, 2, NULL, 2, 06021010, 4751000000, NULL, 0, '12', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0), (12, 3, 1, 2, NULL, 2, 06021010, 4751000000, NULL, 0, '12', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
(13, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '13', NULL, 1, 'VT', 1, NULL, NULL, 1, NULL, 0), (13, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '13', NULL, 1, 'VT', 1, NULL, NULL, 1, NULL, NULL, 0),
(14, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'VT', 1, NULL, NULL, 0, NULL, 0), (14, 5, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'VT', 1, NULL, NULL, 0, NULL, NULL, 0),
(15, 4, NULL, 1, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'EMB', 0, NULL, NULL, 0, NULL, 0), (15, 4, NULL, 1, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'EMB', 0, NULL, NULL, 0, NULL, NULL, 0),
(16, 6, NULL, 1, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'EMB', 0, NULL, NULL, 0, NULL, 0), (16, 6, NULL, 1, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'EMB', 0, NULL, NULL, 0, NULL, NULL, 0),
(71, 6, NULL, 1, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, 0); (71, 6, NULL, 1, NULL, NULL, 06021010, 4751000000, NULL, 0, '', NULL, 0, 'VT', 0, NULL, NULL, 0, NULL, NULL, 0),
(72, 6, NULL, 1, NULL, NULL, 06021010, 4751000000, NULL, 0, NULL, NULL, 1, 'VT', 0, NULL, NULL, 1, 1, NULL, 0),
(88, 1, 1, 2, NULL, NULL, 06021010, 4751000000, NULL, 0, NULL, NULL,10, 'VT', 0, NULL, NULL, 1, NULL, NULL, 0);
-- Update the taxClass after insert of the items -- Update the taxClass after insert of the items
@ -1125,7 +1129,8 @@ INSERT INTO `vn`.`sale`(`id`, `itemFk`, `ticketFk`, `concept`, `quantity`, `pric
(39, 1, 32, 'Ranged weapon longbow 200cm', 2, 103.49, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack'), (39, 1, 32, 'Ranged weapon longbow 200cm', 2, 103.49, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack'),
(40, 2, 34, 'Melee weapon combat fist 15cm', 10.00, 3.91, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack,hasItemLost'), (40, 2, 34, 'Melee weapon combat fist 15cm', 10.00, 3.91, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack,hasItemLost'),
(41, 2, 35, 'Melee weapon combat fist 15cm', 8.00, 3.01, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack,hasRounding,hasItemLost'), (41, 2, 35, 'Melee weapon combat fist 15cm', 8.00, 3.01, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack,hasRounding,hasItemLost'),
(42, 2, 36, 'Melee weapon combat fist 15cm', 6.00, 2.50, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack,hasRounding,hasItemLost'); (42, 2, 36, 'Melee weapon combat fist 15cm', 6.00, 2.50, 0, 0, 0, util.VN_CURDATE(), 'hasComponentLack,hasRounding,hasItemLost'),
(43, 88, 1000000, 'Chest medical box 2', 15.00, 10.00, 0, 0, 0, CURDATE(), '');
INSERT INTO `vn`.`saleComponent`(`saleFk`, `componentFk`, `value`) INSERT INTO `vn`.`saleComponent`(`saleFk`, `componentFk`, `value`)
VALUES VALUES
@ -1356,108 +1361,122 @@ INSERT INTO `vn`.`tag`(`id`, `code`, `name`, `isFree`, `isQuantitatif`, `sourceT
(92, NULL, 'Nombre temporal', 1, 0, NULL, NULL, NULL, NULL); (92, NULL, 'Nombre temporal', 1, 0, NULL, NULL, NULL, NULL);
INSERT INTO `vn`.`itemTag`(`id`,`itemFk`,`tagFk`,`value`,`priority`) INSERT INTO `vn`.`itemTag`(`id`,`itemFk`,`tagFk`,`value`,`priority`)
VALUES VALUES
(1, 1, 56, 'Ranged weapon', 1), (1, 1, 56, 'Ranged weapon', 1),
(2, 1, 58, 'longbow', 2), (2, 1, 58, 'longbow', 2),
(3, 1, 27, '200cm', 3), (3, 1, 27, '200cm', 3),
(4, 1, 36, 'Stark Industries', 4), (4, 1, 36, 'Stark Industries', 4),
(5, 1, 1, 'Brown', 5), (5, 1, 1, 'Brown', 5),
(6, 1, 67, '+1 precission', 6), (6, 1, 67, '+1 precission', 6),
(7, 1, 23, '1', 7), (7, 1, 23, '1', 7),
(8, 2, 56, 'Melee weapon', 1), (8, 2, 56, 'Melee weapon', 1),
(9, 2, 58, 'combat fist', 2), (9, 2, 58, 'combat fist', 2),
(10, 2, 27, '15cm', 3), (10, 2, 27, '15cm', 3),
(11, 2, 36, 'Stark Industries', 4), (11, 2, 36, 'Stark Industries', 4),
(12, 2, 1, 'Silver', 5), (12, 2, 1, 'Silver', 5),
(13, 2, 67, 'Concussion', 6), (13, 2, 67, 'Concussion', 6),
(14, 2, 23, '2', 7), (14, 2, 23, '2', 7),
(15, 3, 56, 'Ranged weapon', 1), (15, 3, 56, 'Ranged weapon', 1),
(16, 3, 58, 'sniper rifle', 2), (16, 3, 58, 'sniper rifle', 2),
(17, 3, 4, '113cm', 3), (17, 3, 4, '113cm', 3),
(18, 3, 36, 'Stark Industries', 4), (18, 3, 36, 'Stark Industries', 4),
(19, 3, 1, 'Green', 5), (19, 3, 1, 'Green', 5),
(20, 3, 67, 'precission', 6), (20, 3, 67, 'precission', 6),
(21, 3, 23, '3', 7), (21, 3, 23, '3', 7),
(22, 4, 56, 'Melee weapon', 1), (22, 4, 56, 'Melee weapon', 1),
(23, 4, 58, 'heavy shield', 2), (23, 4, 58, 'heavy shield', 2),
(24, 4, 4, '100cm', 3), (24, 4, 4, '100cm', 3),
(25, 4, 36, 'Stark Industries', 4), (25, 4, 36, 'Stark Industries', 4),
(26, 4, 1, 'Black', 5), (26, 4, 1, 'Black', 5),
(27, 4, 67, 'containtment', 6), (27, 4, 67, 'containtment', 6),
(28, 4, 23, '4', 7), (28, 4, 23, '4', 7),
(29, 5, 56, 'Ranged weapon', 1), (29, 5, 56, 'Ranged weapon', 1),
(30, 5, 58, 'pistol', 2), (30, 5, 58, 'pistol', 2),
(31, 5, 67, '9mm', 3), (31, 5, 67, '9mm', 3),
(32, 5, 36, 'Stark Industries', 4), (32, 5, 36, 'Stark Industries', 4),
(33, 5, 1, 'Silver', 5), (33, 5, 1, 'Silver', 5),
(34, 5, 27, '15cm', 6), (34, 5, 27, '15cm', 6),
(35, 5, 23, '5', 7), (35, 5, 23, '5', 7),
(36, 6, 56, 'Container', 1), (36, 6, 56, 'Container', 1),
(37, 6, 58, 'ammo box', 2), (37, 6, 58, 'ammo box', 2),
(38, 6, 27, '100cm', 3), (38, 6, 27, '100cm', 3),
(39, 6, 36, 'Stark Industries', 4), (39, 6, 36, 'Stark Industries', 4),
(40, 6, 1, 'Green', 5), (40, 6, 1, 'Green', 5),
(41, 6, 67, 'supply', 6), (41, 6, 67, 'supply', 6),
(42, 6, 23, '6', 7), (42, 6, 23, '6', 7),
(43, 7, 56, 'Container', 1), (43, 7, 56, 'Container', 1),
(44, 7, 58, 'medical box', 2), (44, 7, 58, 'medical box', 2),
(45, 7, 27, '100cm', 3), (45, 7, 27, '100cm', 3),
(46, 7, 36, 'Stark Industries', 4), (46, 7, 36, 'Stark Industries', 4),
(47, 7, 1, 'White', 5), (47, 7, 1, 'White', 5),
(48, 7, 67, 'supply', 6), (48, 7, 67, 'supply', 6),
(49, 7, 23, '7', 7), (49, 7, 23, '7', 7),
(50, 8, 56, 'Ranged Reinforced weapon', 1), (50, 8, 56, 'Ranged Reinforced weapon', 1),
(51, 8, 58, '+1 longbow', 2), (51, 8, 58, '+1 longbow', 2),
(52, 8, 27, '200cm', 3), (52, 8, 27, '200cm', 3),
(53, 8, 36, 'Stark Industries', 4), (53, 8, 36, 'Stark Industries', 4),
(54, 8, 1, 'Brown', 5), (54, 8, 1, 'Brown', 5),
(55, 8, 67, 'precission', 6), (55, 8, 67, 'precission', 6),
(56, 8, 23, '8', 7), (56, 8, 23, '8', 7),
(57, 9, 56, 'Melee Reinforced weapon', 1), (57, 9, 56, 'Melee Reinforced weapon', 1),
(58, 9, 58, 'combat fist', 2), (58, 9, 58, 'combat fist', 2),
(59, 9, 27, '15cm', 3), (59, 9, 27, '15cm', 3),
(60, 9, 36, 'Stark Industries', 4), (60, 9, 36, 'Stark Industries', 4),
(61, 9, 1, 'Silver', 5), (61, 9, 1, 'Silver', 5),
(62, 9, 67, 'Concussion', 6), (62, 9, 67, 'Concussion', 6),
(63, 9, 23, '9', 7), (63, 9, 23, '9', 7),
(64, 10, 56, 'Ranged Reinforced weapon', 1), (64, 10, 56, 'Ranged Reinforced weapon', 1),
(65, 10, 58, 'sniper rifle', 2), (65, 10, 58, 'sniper rifle', 2),
(66, 10, 67, '700mm', 3), (66, 10, 67, '700mm', 3),
(67, 10, 36, 'Stark Industries', 4), (67, 10, 36, 'Stark Industries', 4),
(68, 10, 1, 'Green', 5), (68, 10, 1, 'Green', 5),
(69, 10, 27, '130cm', 6), (69, 10, 27, '130cm', 6),
(70, 10, 23, '10', 7), (70, 10, 23, '10', 7),
(71, 11, 56, 'Melee Reinforced weapon', 1), (71, 11, 56, 'Melee Reinforced weapon', 1),
(72, 11, 58, 'heavy shield', 2), (72, 11, 58, 'heavy shield', 2),
(73, 11, 4, '120cm', 3), (73, 11, 4, '120cm', 3),
(74, 11, 36, 'Stark Industries', 4), (74, 11, 36, 'Stark Industries', 4),
(75, 11, 1, 'Black', 5), (75, 11, 1, 'Black', 5),
(76, 11, 67, 'containtment', 6), (76, 11, 67, 'containtment', 6),
(77, 11, 23, '11', 7), (77, 11, 23, '11', 7),
(78, 12, 56, 'Ranged Reinforced weapon', 1), (78, 12, 56, 'Ranged Reinforced weapon', 1),
(79, 12, 58, 'pistol', 2), (79, 12, 58, 'pistol', 2),
(80, 12, 27, '9mm', 3), (80, 12, 27, '9mm', 3),
(81, 12, 36, 'Stark Industries', 4), (81, 12, 36, 'Stark Industries', 4),
(82, 12, 1, 'Silver', 5), (82, 12, 1, 'Silver', 5),
(83, 12, 67, '23cm', 6), (83, 12, 67, '23cm', 6),
(84, 12, 23, '12', 7), (84, 12, 23, '12', 7),
(85, 13, 56, 'Chest', 1), (85, 13, 56, 'Chest', 1),
(86, 13, 58, 'ammo box', 2), (86, 13, 58, 'ammo box', 2),
(87, 13, 27, '100cm', 3), (87, 13, 27, '100cm', 3),
(88, 13, 36, 'Stark Industries', 4), (88, 13, 36, 'Stark Industries', 4),
(89, 13, 1, 'Green', 5), (89, 13, 1, 'Green', 5),
(90, 13, 67, 'supply', 6), (90, 13, 67, 'supply', 6),
(91, 13, 23, '13', 7), (91, 13, 23, '13', 7),
(92, 14, 56, 'Chest', 1), (92, 14, 56, 'Chest', 1),
(93, 14, 58, 'medical box', 2), (93, 14, 58, 'medical box', 2),
(94, 14, 27, '100cm', 3), (94, 14, 27, '100cm', 3),
(95, 14, 36, 'Stark Industries', 4), (95, 14, 36, 'Stark Industries', 4),
(96, 14, 1, 'White', 5), (96, 14, 1, 'White', 5),
(97, 14, 67, 'supply', 6), (97, 14, 67, 'supply', 6),
(98, 14, 23, '1', 7), (98, 14, 23, '1', 7),
(99, 15, 92, 'Trolley', 2), (99, 15, 92, 'Trolley', 2),
(100, 16, 92, 'Pallet', 2), (100, 16, 92, 'Pallet', 2),
(101, 71, 92, 'Shipping cost', 2); (101, 71, 92, 'Shipping cost', 2),
(102, 88, 56, 'Chest', 1),
(103, 88, 58, 'ammo box', 2),
(104, 88, 27, '100cm', 3),
(105, 88, 36, 'Stark Industries', 4),
(106, 88, 1, 'White', 5),
(107, 88, 67, 'supply', 6),
(108, 88, 23, '13', 7),
(109, 72, 56, 'Mistic weapon', 1),
(110, 72, 58, 'Stormbreaker', 2),
(111, 72, 27, '200cm', 3),
(112, 72, 36, 'Stark Industries', 4),
(113, 72, 1, 'Red/Green/Blue', 5),
(114, 72, 67, '-1 precission', 6),
(115, 72, 23, '1', 7);
INSERT INTO `vn`.`itemTypeTag`(`id`, `itemTypeFk`, `tagFk`, `priority`) INSERT INTO `vn`.`itemTypeTag`(`id`, `itemTypeFk`, `tagFk`, `priority`)
VALUES VALUES
@ -1527,7 +1546,8 @@ INSERT INTO `vn`.`travel`(`id`,`shipped`, `landed`, `warehouseInFk`, `warehouseO
(10, DATE_ADD(util.VN_CURDATE(), INTERVAL +5 DAY), DATE_ADD(util.VN_CURDATE(), INTERVAL +5 DAY), 5, 1, 1, 50.00, 500, 'nineth travel', 1, 2, 10, TRUE, 2), (10, DATE_ADD(util.VN_CURDATE(), INTERVAL +5 DAY), DATE_ADD(util.VN_CURDATE(), INTERVAL +5 DAY), 5, 1, 1, 50.00, 500, 'nineth travel', 1, 2, 10, TRUE, 2),
(11, util.VN_CURDATE() - INTERVAL 1 DAY , util.VN_CURDATE(), 6, 3, 0, 50.00, 500, 'eleventh travel', 1, 2, 4, FALSE, NULL), (11, util.VN_CURDATE() - INTERVAL 1 DAY , util.VN_CURDATE(), 6, 3, 0, 50.00, 500, 'eleventh travel', 1, 2, 4, FALSE, NULL),
(12, util.VN_CURDATE() , util.VN_CURDATE() + INTERVAL 1 DAY, 6, 3, 0, 50.00, 500, 'eleventh travel', 1, 2, 4, FALSE, NULL), (12, util.VN_CURDATE() , util.VN_CURDATE() + INTERVAL 1 DAY, 6, 3, 0, 50.00, 500, 'eleventh travel', 1, 2, 4, FALSE, NULL),
(13, util.VN_CURDATE() - INTERVAL 1 MONTH - INTERVAL 1 DAY, util.VN_CURDATE() - INTERVAL 1 MONTH, 6, 3, 0, 50.00, 500, 'eleventh travel', 1, 2, 4, FALSE, NULL); (13, util.VN_CURDATE() - INTERVAL 1 MONTH - INTERVAL 1 DAY, util.VN_CURDATE() - INTERVAL 1 MONTH, 6, 3, 0, 50.00, 500, 'eleventh travel', 1, 2, 4, FALSE, NULL),
(14, util.VN_CURDATE() - INTERVAL 1 DAY , util.VN_CURDATE() + INTERVAL 1 DAY, 6, 3, 0, 50.00, 500, 'eleventh travel', 1, 2, 4, FALSE, NULL);
INSERT INTO `vn`.`entry`(`id`, `supplierFk`, `created`, `travelFk`, `isConfirmed`, `companyFk`, `invoiceNumber`, `reference`, `isExcludedFromAvailable`, `evaNotes`, `typeFk`) INSERT INTO `vn`.`entry`(`id`, `supplierFk`, `created`, `travelFk`, `isConfirmed`, `companyFk`, `invoiceNumber`, `reference`, `isExcludedFromAvailable`, `evaNotes`, `typeFk`)
VALUES VALUES
@ -1543,10 +1563,11 @@ INSERT INTO `vn`.`entry`(`id`, `supplierFk`, `created`, `travelFk`, `isConfirmed
(10, 2, DATE_ADD(util.VN_CURDATE(), INTERVAL +2 DAY), 10, 0, 442, 'IN2010', 'Movement 10',1, '', 'product'), (10, 2, DATE_ADD(util.VN_CURDATE(), INTERVAL +2 DAY), 10, 0, 442, 'IN2010', 'Movement 10',1, '', 'product'),
(11, 4, DATE_ADD(util.VN_CURDATE(), INTERVAL -2 MONTH), 1, 1, 442, 'IN2011', 'Movement 11',0, '', 'product'), (11, 4, DATE_ADD(util.VN_CURDATE(), INTERVAL -2 MONTH), 1, 1, 442, 'IN2011', 'Movement 11',0, '', 'product'),
(12, 4, util.VN_CURDATE() - INTERVAL 1 MONTH, 13, 1, 442, 'IN2012', 'Movement 12',0, '', 'product'), (12, 4, util.VN_CURDATE() - INTERVAL 1 MONTH, 13, 1, 442, 'IN2012', 'Movement 12',0, '', 'product'),
(99, 69, util.VN_CURDATE() - INTERVAL 1 MONTH, 11, 0, 442, 'IN2009', 'Movement 99',0, '', 'product'); (99, 69, util.VN_CURDATE() - INTERVAL 1 MONTH, 11, 0, 442, 'IN2009', 'Movement 99',0, '', 'product'),
(100, 1, util.VN_CURDATE() , 14, 0, 442, 'IN2009','Movement 100',0, '', 'product');
INSERT INTO `vn`.`entryConfig` (`defaultEntry`, `inventorySupplierFk`, `defaultSupplierFk`) INSERT INTO `vn`.`entryConfig` (`defaultEntry`, `inventorySupplierFk`, `maxLockTime`, `defaultSupplierFk`)
VALUES (2, 4, 1); VALUES (2, 4, 300, 1);
INSERT INTO `bs`.`waste`(`buyerFk`, `year`, `week`, `itemFk`, `itemTypeFk`, `saleTotal`, `saleWasteQuantity`, `saleExternalWaste`, `saleFaultWaste`, `saleContainerWaste`, `saleBreakWaste`, `saleOtherWaste`) INSERT INTO `bs`.`waste`(`buyerFk`, `year`, `week`, `itemFk`, `itemTypeFk`, `saleTotal`, `saleWasteQuantity`, `saleExternalWaste`, `saleFaultWaste`, `saleContainerWaste`, `saleBreakWaste`, `saleOtherWaste`)
VALUES VALUES
@ -1566,26 +1587,35 @@ INSERT INTO `bs`.`waste`(`buyerFk`, `year`, `week`, `itemFk`, `itemTypeFk`, `sal
('103', YEAR(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK)), WEEK(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK), 1), 6, 1, '186', '0', '51', '53.12', '56.20', '56.20', '56.20'), ('103', YEAR(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK)), WEEK(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK), 1), 6, 1, '186', '0', '51', '53.12', '56.20', '56.20', '56.20'),
('103', YEAR(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK)), WEEK(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK), 1), 7, 1, '277', '0', '53.12', '56.20', '56.20', '56.20', '56.20'); ('103', YEAR(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK)), WEEK(DATE_ADD(util.VN_CURDATE(), INTERVAL -1 WEEK), 1), 7, 1, '277', '0', '53.12', '56.20', '56.20', '56.20', '56.20');
INSERT INTO vn.buy(id,entryFk,itemFk,buyingValue,quantity,packagingFk,stickers,freightValue,packageValue,comissionValue,packing,grouping,groupingMode,location,price1,price2,price3,printedStickers,isChecked,isIgnored,weight,created) INSERT INTO edi.supplier (supplier_id,company_name,entry_date,expiry_date,change_date_time,isAllowedDirectSales,isBanned)
VALUES VALUES (1,'MV', util.VN_CURDATE(), util.VN_CURDATE(), util.VN_CURDATE(), 1, 0);
(1, 1, 1, 50, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, 1, util.VN_CURDATE() - INTERVAL 2 MONTH), INSERT INTO edi.ekt (id,`ref`,qty,pro,pri,ok,scanned)
(2, 2, 1, 50, 100, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, 1, util.VN_CURDATE() - INTERVAL 1 MONTH), VALUES (1, 1234, 1, 1, 1.1, 1, 1);
(3, 3, 1, 50, 100, 4, 1, 1.500, 1.500, 0.000, 1, 1, NULL, NULL, 0.00, 99.6, 99.4, 0, 1, 0, 1, util.VN_CURDATE()),
(4, 2, 2, 5, 450, 3, 1, 1.000, 1.000, 0.000, 10, 10, NULL, NULL, 0.00, 7.30, 7.00, 0, 1, 0, 2.5, util.VN_CURDATE()), INSERT INTO vn.buy(id,entryFk,itemFk,buyingValue,quantity,packagingFk,stickers,freightValue,packageValue,comissionValue,packing,grouping,groupingMode,location,price1,price2,price3,printedStickers,isChecked,isIgnored,ektFk,weight,created)
(5, 3, 3, 55, 500, 5, 1, 1.000, 1.000, 0.000, 1, 1, NULL, NULL, 0.00, 78.3, 75.6, 0, 1, 0, 2.5, util.VN_CURDATE()), VALUES
(6, 4, 8, 50, 1000, 4, 1, 1.000, 1.000, 0.000, 1, 1, 'grouping', NULL, 0.00, 99.6, 99.4, 0, 1, 0, 2.5, util.VN_CURDATE()), ( 1, 1, 1, 50, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 1, util.VN_CURDATE() - INTERVAL 2 MONTH),
(7, 4, 9, 20, 1000, 3, 1, 0.500, 0.500, 0.000, 10, 10, 'packing', NULL, 0.00, 30.50, 29.00, 0, 1, 0, 2.5, util.VN_CURDATE()), ( 2, 2, 1, 50, 100, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 1, util.VN_CURDATE() - INTERVAL 1 MONTH),
(8, 4, 4, 1.25, 1000, 3, 1, 0.500, 0.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, 2.5, util.VN_CURDATE()), ( 3, 3, 1, 50, 100, 4, 1, 1.500, 1.500, 0.000, 1, 1, NULL, NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 1, util.VN_CURDATE()),
(9, 4, 4, 1.25, 1000, 3, 1, 0.500, 0.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, 4, util.VN_CURDATE()), ( 4, 2, 2, 5, 450, 3, 1, 1.000, 1.000, 0.000, 10, 10, NULL, NULL, 0.00, 7.30, 7.00, 0, 1, 0, NULL, 2.5, util.VN_CURDATE()),
(10, 5, 1, 50, 10, 4, 1, 2.500, 2.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, 4, util.VN_CURDATE()), ( 5, 3, 3, 55, 500, 5, 1, 1.000, 1.000, 0.000, 1, 1, NULL, NULL, 0.00, 78.3, 75.6, 0, 1, 0, NULL, 2.5, util.VN_CURDATE()),
(11, 5, 4, 1.25, 10, 3, 1, 2.500, 2.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, 4, util.VN_CURDATE()), ( 6, 4, 8, 50, 1000, 4, 1, 1.000, 1.000, 0.000, 1, 1, 'grouping', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 2.5, util.VN_CURDATE()),
(12, 6, 4, 1.25, 0, 3, 1, 2.500, 2.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, 4, util.VN_CURDATE()), ( 7, 4, 9, 20, 1000, 3, 1, 0.500, 0.500, 0.000, 10, 10, 'packing', NULL, 0.00, 30.50, 29.00, 0, 1, 0, NULL, 2.5, util.VN_CURDATE()),
(13, 7, 1, 50, 0, 3, 1, 2.000, 2.000, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, 4, util.VN_CURDATE()), ( 8, 4, 4, 1.25, 1000, 3, 1, 0.500, 0.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, NULL, 2.5, util.VN_CURDATE()),
(14, 7, 2, 5, 0, 3, 1, 2.000, 2.000, 0.000, 10, 10, 'grouping', NULL, 0.00, 7.30, 7.00, 0, 1, 0, 4, util.VN_CURDATE()), ( 9, 4, 4, 1.25, 1000, 3, 1, 0.500, 0.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, NULL, 4, util.VN_CURDATE()),
(15, 7, 4, 1.25, 0, 3, 1, 2.000, 2.000, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, 4, util.VN_CURDATE()), (10, 5, 1, 50, 10, 4, 1, 2.500, 2.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 4, util.VN_CURDATE()),
(16, 99,1,50.0000, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.60, 99.40, 0, 1, 0, 1.00, '2024-07-30 08:13:51.000'), (11, 5, 4, 1.25, 10, 3, 1, 2.500, 2.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, NULL, 4, util.VN_CURDATE()),
(17, 11, 1, 50, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, 1, util.VN_CURDATE() - INTERVAL 2 MONTH), (12, 6, 4, 1.25, 0, 3, 1, 2.500, 2.500, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, NULL, 4, util.VN_CURDATE()),
(18, 12, 1, 50, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'grouping', NULL, 0.00, 99.6, 99.4, 0, 1, 0, 1, util.VN_CURDATE() - INTERVAL 2 MONTH); (13, 7, 1, 50, 0, 3, 1, 2.000, 2.000, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 4, util.VN_CURDATE()),
(14, 7, 2, 5, 0, 3, 1, 2.000, 2.000, 0.000, 10, 10, 'grouping', NULL, 0.00, 7.30, 7.00, 0, 1, 0, NULL, 4, util.VN_CURDATE()),
(15, 7, 4, 1.25, 0, 3, 1, 2.000, 2.000, 0.000, 10, 10, 'grouping', NULL, 0.00, 1.75, 1.67, 0, 1, 0, NULL, 4, util.VN_CURDATE()),
(16, 99, 1, 50.0000, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.60, 99.40, 0, 1, 0, NULL, 1.00, '2024-07-30 08:13:51.000'),
(17, 11, 1, 50, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'packing', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 1, util.VN_CURDATE() - INTERVAL 2 MONTH),
(18, 12, 1, 50, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'grouping', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 1, util.VN_CURDATE() - INTERVAL 2 MONTH),
(19, 100, 1, 50, 100, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'grouping', NULL, 0.00, 99.6, 99.4, 0, 1, 0, NULL, 1, util.VN_CURDATE()),
(20, 100, 2, 5, 450, 3, 2, 1.000, 1.000, 0.000, 10, 10, NULL, NULL, 0.00, 7.30, 7.00, 0, 1, 0, NULL, 2.5, util.VN_CURDATE()),
(21, 100,72, 55, 500, 5, 3, 1.000, 1.000, 0.000, 1, 1, 'packing', NULL, 0.00, 78.3, 75.6, 0, 1, 0, 1, 3, util.VN_CURDATE()),
(10000002, 12,88, 50.0000, 5000, 4, 1, 1.500, 1.500, 0.000, 1, 1, 'grouping', NULL, 0.00, 99.60, 99.40, 0, 1, 0, 1.00, 1,util.VN_CURDATE() - INTERVAL 2 MONTH);
INSERT INTO `hedera`.`order`(`id`, `date_send`, `customer_id`, `delivery_method_id`, `agency_id`, `address_id`, `company_id`, `note`, `source_app`, `confirmed`,`total`, `date_make`, `first_row_stamp`, `confirm_date`) INSERT INTO `hedera`.`order`(`id`, `date_send`, `customer_id`, `delivery_method_id`, `agency_id`, `address_id`, `company_id`, `note`, `source_app`, `confirmed`,`total`, `date_make`, `first_row_stamp`, `confirm_date`)
VALUES VALUES
@ -2755,11 +2785,11 @@ INSERT INTO `vn`.`roadmapAddress` (`addressFk`)
(3), (3),
(4); (4);
INSERT INTO `vn`.`roadmap` (`id`, `name`, `tractorPlate`, `trailerPlate`, `phone`, `supplierFk`, `etd`, `eta`, `observations`, `editorFk`, `price`, `driverName`) INSERT INTO `vn`.`roadmap` (`id`, `name`, `tractorPlate`, `trailerPlate`, `phone`, `supplierFk`, `etd`, `observations`, `editorFk`, `price`, `driverName`)
VALUES VALUES
(1, 'val-algemesi', '1234-BCD', '9876-BCD', '111111111', 1, util.VN_NOW(), DATE_ADD(util.VN_NOW(), INTERVAL 2 DAY), 'this is test observation', 1, 15, 'Batman'), (1, 'val-algemesi', '1234-BCD', '9876-BCD', '111111111', 1, util.VN_NOW(), 'this is test observation', 1, 15, 'Batman'),
(2, 'alg-valencia', '2345-CDF', '8765-BCD', '111111111', 1, util.VN_NOW(), DATE_ADD(util.VN_NOW(), INTERVAL 5 DAY), 'test observation', 1, 20, 'Robin'), (2, 'alg-valencia', '2345-CDF', '8765-BCD', '111111111', 1, util.VN_NOW(), 'test observation', 1, 20, 'Robin'),
(3, 'alz-algemesi', '3456-DFG', '7654-BCD', '222222222', 2, DATE_ADD(util.VN_NOW(), INTERVAL 3 DAY), DATE_ADD(util.VN_NOW(), INTERVAL 6 DAY), 'observations...', 2, 25, 'Driverman'); (3, 'alz-algemesi', '3456-DFG', '7654-BCD', '222222222', 2, DATE_ADD(util.VN_NOW(), INTERVAL 3 DAY), 'observations...', 2, 25, 'Driverman');
INSERT INTO `vn`.`roadmapStop` (`id`, `roadmapFk`, `roadmapAddressFk`, `eta`, `description`, `editorFk`) INSERT INTO `vn`.`roadmapStop` (`id`, `roadmapFk`, `roadmapAddressFk`, `eta`, `description`, `editorFk`)
VALUES VALUES
@ -4111,4 +4141,4 @@ INSERT IGNORE INTO vn.vehicleType (id, name)
VALUES (1,'vehículo empresa'), VALUES (1,'vehículo empresa'),
(2, 'furgoneta'), (2, 'furgoneta'),
(3, 'cabeza tractora'), (3, 'cabeza tractora'),
(4, 'remolque'); (4, 'remolque');
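
-- A quick sanity-check sketch for the new fixture chain added above (entry 100 on travel 14 and its
-- buy rows); the ids are fixture values only, assuming the fixture dump above has been loaded.
SELECT e.id entryFk, e.travelFk, b.id buyFk, b.itemFk, b.quantity
    FROM vn.entry e
    JOIN vn.buy b ON b.entryFk = e.id
    WHERE e.id = 100;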


@ -89,12 +89,12 @@ proc: BEGIN
AND (ir.ended IS NULL OR i.shipped <= ir.ended) AND (ir.ended IS NULL OR i.shipped <= ir.ended)
AND i.warehouseFk = vWarehouse AND i.warehouseFk = vWarehouse
UNION ALL UNION ALL
SELECT i.itemFk, i.landed, i.quantity SELECT i.itemFk, IFNULL(i.availabled, i.landed), i.quantity
FROM vn.itemEntryIn i FROM vn.itemEntryIn i
JOIN itemRange ir ON ir.itemFk = i.itemFk JOIN itemRange ir ON ir.itemFk = i.itemFk
WHERE i.landed >= vStartDate WHERE IFNULL(i.availabled, i.landed) >= vStartDate
AND IFNULL(i.availabled, i.landed) <= vAvailabled AND IFNULL(i.availabled, i.landed) <= vAvailabled
AND (ir.ended IS NULL OR i.landed <= ir.ended) AND (ir.ended IS NULL OR IFNULL(i.availabled, i.landed) <= ir.ended)
AND i.warehouseInFk = vWarehouse AND i.warehouseInFk = vWarehouse
UNION ALL UNION ALL
SELECT i.itemFk, i.shipped, i.quantity SELECT i.itemFk, i.shipped, i.quantity
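
-- The branch above now keys incoming entries on IFNULL(i.availabled, i.landed), so stock only counts
-- from the moment it is actually available and falls back to the landing date otherwise. A minimal
-- sketch of the same fallback, assuming vn.itemEntryIn exposes both columns as used above:
SELECT i.itemFk,
        IFNULL(i.availabled, i.landed) effectiveDate,
        i.quantity
    FROM vn.itemEntryIn i
    WHERE IFNULL(i.availabled, i.landed) >= util.VN_CURDATE();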


@ -160,9 +160,11 @@ BEGIN
OR (NOT s.isPreparable AND NOT s.isPrintable) OR (NOT s.isPreparable AND NOT s.isPrintable)
OR pb.collectionH IS NOT NULL OR pb.collectionH IS NOT NULL
OR pb.collectionV IS NOT NULL OR pb.collectionV IS NOT NULL
OR pb.collectionA IS NOT NULL
OR pb.collectionN IS NOT NULL OR pb.collectionN IS NOT NULL
OR (NOT pb.H AND pb.V > 0 AND vItemPackingTypeFk = 'H') OR (NOT pb.H AND pb.V + pb.A > 0 AND vItemPackingTypeFk = 'H')
OR (NOT pb.V AND vItemPackingTypeFk = 'V') OR (NOT pb.V AND vItemPackingTypeFk = 'V')
OR (NOT pb.A AND vItemPackingTypeFk = 'A')
OR (pc.isPreviousPreparationRequired AND pb.previousWithoutParking) OR (pc.isPreviousPreparationRequired AND pb.previousWithoutParking)
OR LENGTH(pb.problem) OR LENGTH(pb.problem)
OR pb.lines > vLinesLimit OR pb.lines > vLinesLimit


@ -30,7 +30,7 @@ BEGIN
WITH entriesIn AS ( WITH entriesIn AS (
SELECT 'entry' originType, SELECT 'entry' originType,
e.id originId, e.id originId,
tr.landed shipped, IFNULL(tr.availabled, tr.landed) shipped,
b.quantity `in`, b.quantity `in`,
NULL `out`, NULL `out`,
st.alertLevel , st.alertLevel ,
@ -54,7 +54,7 @@ BEGIN
OR (util.VN_CURDATE() AND tr.isReceived), OR (util.VN_CURDATE() AND tr.isReceived),
'DELIVERED', 'DELIVERED',
'FREE') 'FREE')
WHERE tr.landed >= vDateInventory WHERE IFNULL(tr.availabled, tr.landed) >= vDateInventory
AND tr.warehouseInFk = vWarehouseFk AND tr.warehouseInFk = vWarehouseFk
AND (s.id <> vSupplierInventoryFk OR vDated IS NULL) AND (s.id <> vSupplierInventoryFk OR vDated IS NULL)
AND b.itemFk = vItemFk AND b.itemFk = vItemFk
@ -99,7 +99,7 @@ BEGIN
), ),
sales AS ( sales AS (
WITH itemSales AS ( WITH itemSales AS (
SELECT DATE(t.shipped) shipped, SELECT DATE(t.shipped) + INTERVAL HOUR(z.`hour`) HOUR shipped,
s.quantity, s.quantity,
st2.alertLevel, st2.alertLevel,
st2.name, st2.name,
@ -114,6 +114,7 @@ BEGIN
cb.claimFk cb.claimFk
FROM vn.sale s FROM vn.sale s
JOIN vn.ticket t ON t.id = s.ticketFk JOIN vn.ticket t ON t.id = s.ticketFk
JOIN vn.`zone` z ON z.id = t.zoneFk
LEFT JOIN vn.ticketState ts ON ts.ticketFk = t.id LEFT JOIN vn.ticketState ts ON ts.ticketFk = t.id
LEFT JOIN vn.state st ON st.code = ts.code LEFT JOIN vn.state st ON st.code = ts.code
JOIN vn.client c ON c.id = t.clientFk JOIN vn.client c ON c.id = t.clientFk
@ -189,14 +190,15 @@ BEGIN
SELECT * FROM sales SELECT * FROM sales
UNION ALL UNION ALL
SELECT * FROM orders SELECT * FROM orders
ORDER BY shipped, ORDER BY DATE(shipped),
(inventorySupplierFk = entityId) DESC, (inventorySupplierFk = entityId) DESC,
alertLevel DESC, alertLevel DESC,
isTicket, isTicket,
`order` DESC, `order` DESC,
isPicked DESC, isPicked DESC,
`in` DESC, `in` DESC,
`out` DESC; `out` DESC,
shipped;
IF vDated IS NULL THEN IF vDated IS NULL THEN
SET @a := 0; SET @a := 0;
@ -205,7 +207,7 @@ BEGIN
SELECT t.originType, SELECT t.originType,
t.originId, t.originId,
DATE(@shipped:= t.shipped) shipped, @shipped:= t.shipped,
t.alertLevel, t.alertLevel,
t.stateName, t.stateName,
t.reference, t.reference,


@ -1,9 +1,20 @@
DELIMITER $$ DELIMITER $$
CREATE OR REPLACE DEFINER=`vn`@`localhost` PROCEDURE `vn`.`item_getLack`(IN vForce BOOLEAN, IN vDays INT) CREATE OR REPLACE DEFINER=`vn`@`localhost` PROCEDURE `vn`.`item_getLack`(
vSelf INT,
vForce BOOLEAN,
vDays INT,
vLongname VARCHAR(255),
vProducerName VARCHAR(255),
vColor VARCHAR(255),
vSize INT,
vOrigen INT,
vLack INT,
vWarehouseFk INT
)
BEGIN BEGIN
/** /**
* Calcula una tabla con el máximo negativo visible para cada producto y almacen * Calcula una tabla con el máximo negativo visible para cada producto y almacen
* *
* @param vForce Fuerza el recalculo del stock * @param vForce Fuerza el recalculo del stock
* @param vDays Numero de dias a considerar * @param vDays Numero de dias a considerar
**/ **/
@ -13,33 +24,33 @@ BEGIN
CALL item_getMinETD(); CALL item_getMinETD();
CALL item_zoneClosure(); CALL item_zoneClosure();
SELECT i.id itemFk, SELECT i.id itemFk,
i.longName, i.longName,
w.id warehouseFk, w.id warehouseFk,
p.`name` producer, p.`name` producer,
i.`size`, i.`size`,
i.category, i.category,
w.name warehouse, w.name warehouse,
SUM(IFNULL(sub.amount,0)) lack, SUM(IFNULL(sub.amount,0)) lack,
i.inkFk, i.inkFk,
IFNULL(im.timed, util.midnight()) timed, IFNULL(im.timed, util.midnight()) timed,
IFNULL(izc.timed, util.midnight()) minTimed, IFNULL(izc.timed, util.midnight()) minTimed,
o.name originFk o.name originFk
FROM (SELECT item_id, FROM (SELECT item_id,
warehouse_id, warehouse_id,
amount amount
FROM cache.stock FROM cache.stock
WHERE amount > 0 WHERE amount > 0
UNION ALL UNION ALL
SELECT itemFk, SELECT itemFk,
warehouseFk, warehouseFk,
amount amount
FROM tmp.itemMinacum FROM tmp.itemMinacum
) sub ) sub
JOIN warehouse w ON w.id = sub.warehouse_id JOIN warehouse w ON w.id = sub.warehouse_id
JOIN item i ON i.id = sub.item_id JOIN item i ON i.id = sub.item_id
LEFT JOIN producer p ON p.id = i.producerFk LEFT JOIN producer p ON p.id = i.producerFk
JOIN itemType it ON it.id = i.typeFk JOIN itemType it ON it.id = i.typeFk
JOIN itemCategory ic ON ic.id = it.categoryFk JOIN itemCategory ic ON ic.id = it.categoryFk
LEFT JOIN tmp.itemMinETD im ON im.itemFk = i.id LEFT JOIN tmp.itemMinETD im ON im.itemFk = i.id
LEFT JOIN tmp.itemZoneClosure izc ON izc.itemFk = i.id LEFT JOIN tmp.itemZoneClosure izc ON izc.itemFk = i.id
@ -47,6 +58,14 @@ BEGIN
WHERE w.isForTicket WHERE w.isForTicket
AND ic.display AND ic.display
AND it.code != 'GEN' AND it.code != 'GEN'
AND (vSelf IS NULL OR i.id = vSelf)
AND (vLongname IS NULL OR i.name = vLongname)
AND (vProducerName IS NULL OR p.`name` LIKE CONCAT('%', vProducerName, '%'))
AND (vColor IS NULL OR vColor = i.inkFk)
AND (vSize IS NULL OR vSize = i.`size`)
AND (vOrigen IS NULL OR vOrigen = w.id)
AND (vLack IS NULL OR vLack = sub.amount)
AND (vWarehouseFk IS NULL OR vWarehouseFk = w.id)
GROUP BY i.id, w.id GROUP BY i.id, w.id
HAVING lack < 0; HAVING lack < 0;
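
-- With the widened signature every new filter is optional and NULL skips it. A hedged call sketch
-- with the parameters in the order declared above; the literal values are illustrative only.
CALL vn.item_getLack(
    NULL,  -- vSelf: NULL = all items
    TRUE,  -- vForce: force stock recalculation
    2,     -- vDays: days to consider
    NULL,  -- vLongname
    NULL,  -- vProducerName
    NULL,  -- vColor
    NULL,  -- vSize
    NULL,  -- vOrigen
    NULL,  -- vLack
    1);    -- vWarehouseFk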


@ -82,21 +82,26 @@ BEGIN
AND it.priority = vPriority AND it.priority = vPriority
LEFT JOIN vn.tag t ON t.id = it.tagFk LEFT JOIN vn.tag t ON t.id = it.tagFk
LEFT JOIN vn.buy b ON b.id = bu.buyFk LEFT JOIN vn.buy b ON b.id = bu.buyFk
LEFT JOIN vn.itemShelvingStock iss ON iss.itemFk = i.id
AND iss.warehouseFk = vWarehouseFk
LEFT JOIN vn.ink ink ON ink.id = i.tag5
JOIN itemTags its JOIN itemTags its
WHERE a.available > 0 WHERE a.available > 0
AND (i.typeFk = its.typeFk OR NOT vShowType) AND (i.typeFk = its.typeFk OR NOT vShowType)
AND i.id <> vSelf AND i.id <> vSelf
ORDER BY `counter` DESC, ORDER BY (a.available > 0) DESC,
(t.name = its.name) DESC, `counter` DESC,
(it.value = its.value) DESC, (t.name = its.name) DESC,
(i.tag5 = its.tag5) DESC, (it.value = its.value) DESC,
match5 DESC, (i.tag5 = its.tag5) DESC,
(i.tag6 = its.tag6) DESC, (ink.`showOrder`) DESC,
match6 DESC, match5 DESC,
(i.tag7 = its.tag7) DESC, (i.tag6 = its.tag6) DESC,
match7 DESC, match6 DESC,
(i.tag8 = its.tag8) DESC, (i.tag7 = its.tag7) DESC,
match8 DESC match7 DESC,
(i.tag8 = its.tag8) DESC,
match8 DESC
LIMIT 100; LIMIT 100;
DROP TEMPORARY TABLE tmp.buyUltimate; DROP TEMPORARY TABLE tmp.buyUltimate;


@ -35,8 +35,8 @@ BEGIN
SELECT iei.itemFk, iei.quantity SELECT iei.itemFk, iei.quantity
FROM itemEntryIn iei FROM itemEntryIn iei
JOIN item i ON i.id = iei.itemFk JOIN item i ON i.id = iei.itemFk
WHERE iei.landed >= util.VN_CURDATE() WHERE IFNULL(iei.availabled, iei.landed) >= util.VN_CURDATE()
AND iei.landed < vDated AND IFNULL(iei.availabled, iei.landed) < vDated
AND iei.warehouseInFk = vWarehouseFk AND iei.warehouseInFk = vWarehouseFk
AND (vItemFk IS NULL OR iei.itemFk = vItemFk) AND (vItemFk IS NULL OR iei.itemFk = vItemFk)
UNION ALL UNION ALL


@ -91,6 +91,7 @@ proc: BEGIN
pk.code parking, pk.code parking,
0 H, 0 H,
0 V, 0 V,
0 A,
0 N, 0 N,
st.isOk, st.isOk,
ag.isOwn, ag.isOwn,
@ -138,6 +139,7 @@ proc: BEGIN
CHANGE COLUMN `problem` `problem` VARCHAR(255), CHANGE COLUMN `problem` `problem` VARCHAR(255),
ADD COLUMN `collectionH` INT, ADD COLUMN `collectionH` INT,
ADD COLUMN `collectionV` INT, ADD COLUMN `collectionV` INT,
ADD COLUMN `collectionA` INT,
ADD COLUMN `collectionN` INT; ADD COLUMN `collectionN` INT;
-- Clientes Nuevos o Recuperados -- Clientes Nuevos o Recuperados
@ -178,12 +180,14 @@ proc: BEGIN
ENGINE = MEMORY ENGINE = MEMORY
SELECT ticketFk, SELECT ticketFk,
SUM(sub.H) H, SUM(sub.H) H,
SUM(sub.V) V, SUM(sub.V) V,
SUM(sub.A) A,
SUM(sub.N) N SUM(sub.N) N
FROM ( FROM (
SELECT t.ticketFk, SELECT t.ticketFk,
SUM(i.itemPackingTypeFk = 'H') H, SUM(i.itemPackingTypeFk = 'H') H,
SUM(i.itemPackingTypeFk = 'V') V, SUM(i.itemPackingTypeFk = 'V') V,
SUM(i.itemPackingTypeFk = 'A') A,
SUM(i.itemPackingTypeFk IS NULL) N SUM(i.itemPackingTypeFk IS NULL) N
FROM tmp.productionTicket t FROM tmp.productionTicket t
JOIN sale s ON s.ticketFk = t.ticketFk JOIN sale s ON s.ticketFk = t.ticketFk
@ -196,6 +200,7 @@ proc: BEGIN
JOIN tItemPackingType ti ON ti.ticketFk = pb.ticketFk JOIN tItemPackingType ti ON ti.ticketFk = pb.ticketFk
SET pb.H = ti.H, SET pb.H = ti.H,
pb.V = ti.V, pb.V = ti.V,
pb.A = ti.A,
pb.N = ti.N; pb.N = ti.N;
-- Colecciones segun tipo de encajado -- Colecciones segun tipo de encajado
@ -203,6 +208,7 @@ proc: BEGIN
JOIN ticketCollection tc ON pb.ticketFk = tc.ticketFk JOIN ticketCollection tc ON pb.ticketFk = tc.ticketFk
SET pb.collectionH = IF(pb.H, tc.collectionFk, NULL), SET pb.collectionH = IF(pb.H, tc.collectionFk, NULL),
pb.collectionV = IF(pb.V, tc.collectionFk, NULL), pb.collectionV = IF(pb.V, tc.collectionFk, NULL),
pb.collectionA = IF(pb.A, tc.collectionFk, NULL),
pb.collectionN = IF(pb.N, tc.collectionFk, NULL); pb.collectionN = IF(pb.N, tc.collectionFk, NULL);
-- Previa pendiente -- Previa pendiente


@ -25,9 +25,11 @@ BEGIN
DECLARE vNewSaleFk INT; DECLARE vNewSaleFk INT;
DECLARE vFinalPrice DECIMAL(10,2); DECLARE vFinalPrice DECIMAL(10,2);
DECLARE vIsRequiredTx BOOL DEFAULT NOT @@in_transaction;
DECLARE EXIT HANDLER FOR SQLEXCEPTION DECLARE EXIT HANDLER FOR SQLEXCEPTION
BEGIN BEGIN
ROLLBACK; CALL util.tx_rollback(vIsRequiredTx);
RESIGNAL; RESIGNAL;
END; END;
@ -62,7 +64,7 @@ BEGIN
WHERE tmp.itemFk = vNewItemFk AND tmp.WarehouseFk = vWarehouseFk; WHERE tmp.itemFk = vNewItemFk AND tmp.WarehouseFk = vWarehouseFk;
DROP TEMPORARY TABLE tmp.buyUltimate; DROP TEMPORARY TABLE tmp.buyUltimate;
IF vGroupingMode = 'packing' AND vPacking > 0 THEN IF vGroupingMode = 'packing' AND vPacking > 0 THEN
SET vRoundQuantity = vPacking; SET vRoundQuantity = vPacking;
END IF; END IF;
@ -129,6 +131,6 @@ BEGIN
VALUES(vItemFk, vNewItemFk, 1) VALUES(vItemFk, vNewItemFk, 1)
ON DUPLICATE KEY UPDATE counter = counter + 1; ON DUPLICATE KEY UPDATE counter = counter + 1;
COMMIT; CALL util.tx_commit(vIsRequiredTx);
END$$ END$$
DELIMITER ; DELIMITER ;
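
-- The handler above now delegates to the util.tx_* helpers, so the procedure only rolls back or
-- commits a transaction it opened itself. A minimal sketch of the pattern, assuming a counterpart
-- util.tx_start helper (not shown in this hunk) and a hypothetical procedure name:
DELIMITER $$
CREATE OR REPLACE PROCEDURE vn.tmp_txPatternSketch()
BEGIN
    DECLARE vIsRequiredTx BOOL DEFAULT NOT @@in_transaction; -- own the tx only if none is already open
    DECLARE EXIT HANDLER FOR SQLEXCEPTION
    BEGIN
        CALL util.tx_rollback(vIsRequiredTx); -- roll back only the tx this procedure opened
        RESIGNAL;
    END;
    CALL util.tx_start(vIsRequiredTx); -- assumption: start helper matching tx_commit/tx_rollback
    -- ... procedure work ...
    CALL util.tx_commit(vIsRequiredTx); -- commit only the tx this procedure opened
END$$
DELIMITER ;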


@ -9,15 +9,12 @@ BEGIN
*/ */
DECLARE vDone BOOL; DECLARE vDone BOOL;
DECLARE vClientFk INT; DECLARE vClientFk INT;
DECLARE vCurTicketFk INT; DECLARE vCurTicketFk INT;
DECLARE vIsTaxDataChecked BOOL;
DECLARE vCompanyFk INT;
DECLARE vShipped DATE;
DECLARE vNewInvoiceId INT; DECLARE vNewInvoiceId INT;
DECLARE vHasDailyInvoice BOOL; DECLARE vHasDailyInvoice BOOL;
DECLARE vWithPackage BOOL; DECLARE vWithPackage BOOL;
DECLARE vHasToInvoice BOOL; DECLARE vHasToInvoice BOOL;
DECLARE vSerial VARCHAR(2); DECLARE vStateCode VARCHAR(45);
DECLARE cur CURSOR FOR DECLARE cur CURSOR FOR
SELECT ticketFk FROM tmp.ticket_close; SELECT ticketFk FROM tmp.ticket_close;
@ -38,18 +35,11 @@ BEGIN
LEAVE proc; LEAVE proc;
END IF; END IF;
SELECT SELECT c.id,
c.id,
c.isTaxDataChecked,
t.companyFk,
t.shipped,
c.hasDailyInvoice, c.hasDailyInvoice,
w.isManaged, w.isManaged,
c.hasToInvoice c.hasToInvoice
INTO vClientFk, INTO vClientFk,
vIsTaxDataChecked,
vCompanyFk,
vShipped,
vHasDailyInvoice, vHasDailyInvoice,
vWithPackage, vWithPackage,
vHasToInvoice vHasToInvoice
@ -59,7 +49,7 @@ BEGIN
WHERE t.id = vCurTicketFk; WHERE t.id = vCurTicketFk;
INSERT INTO ticketPackaging (ticketFk, packagingFk, quantity) INSERT INTO ticketPackaging (ticketFk, packagingFk, quantity)
(SELECT vCurTicketFk, p.id, COUNT(*) SELECT vCurTicketFk, p.id, COUNT(*)
FROM expedition e FROM expedition e
JOIN packaging p ON p.id = e.packagingFk JOIN packaging p ON p.id = e.packagingFk
JOIN ticket t ON t.id = e.ticketFk JOIN ticket t ON t.id = e.ticketFk
@ -68,39 +58,35 @@ BEGIN
WHERE e.ticketFk = vCurTicketFk AND p.isPackageReturnable WHERE e.ticketFk = vCurTicketFk AND p.isPackageReturnable
AND vWithPackage AND vWithPackage
AND NOT dm.`code`= 'PICKUP' AND NOT dm.`code`= 'PICKUP'
GROUP BY p.itemFk); GROUP BY p.itemFk;
-- No retornables o no catalogados -- No retornables o no catalogados
INSERT INTO sale (itemFk, ticketFk, concept, quantity, price, isPriceFixed) INSERT INTO sale (
(SELECT e.freightItemFk, vCurTicketFk, i.name, COUNT(*) AS amount, getSpecialPrice(e.freightItemFk, vClientFk), 1 itemFk,
ticketFk,
concept,
quantity,
price, isPriceFixed
)SELECT e.freightItemFk,
vCurTicketFk,
i.name,
COUNT(*) amount,
getSpecialPrice(e.freightItemFk, vClientFk),
TRUE
FROM expedition e FROM expedition e
JOIN item i ON i.id = e.freightItemFk JOIN item i ON i.id = e.freightItemFk
LEFT JOIN packaging p ON p.itemFk = i.id LEFT JOIN packaging p ON p.itemFk = i.id
WHERE e.ticketFk = vCurTicketFk AND IFNULL(p.isPackageReturnable, 0) = 0 WHERE e.ticketFk = vCurTicketFk
AND (p.isPackageReturnable = 0 OR p.isPackageReturnable IS NULL)
AND getSpecialPrice(e.freightItemFk, vClientFk) > 0 AND getSpecialPrice(e.freightItemFk, vClientFk) > 0
GROUP BY e.freightItemFk); GROUP BY e.freightItemFk;
IF(vHasDailyInvoice) AND vHasToInvoice THEN IF vHasDailyInvoice AND vHasToInvoice THEN
SELECT invoiceSerial(vClientFk, vCompanyFk, 'quick') INTO vSerial; SET vStateCode = 'DELIVERED';
IF vSerial IS NULL THEN
CALL util.throw('Cannot booking without a serial');
END IF;
CALL ticket_setState(vCurTicketFk, 'DELIVERED');
IF vIsTaxDataChecked THEN
CALL invoiceOut_newFromClient(
vClientFk,
vSerial,
vShipped,
vCompanyFk,
NULL,
NULL,
vNewInvoiceId);
END IF;
ELSE ELSE
CALL ticket_setState(vCurTicketFk, (SELECT vn.getAlert3State(vCurTicketFk))); SELECT getAlert3State(vCurTicketFk) INTO vStateCode;
END IF; END IF;
CALL ticket_setState(vCurTicketFk, vStateCode);
END LOOP; END LOOP;
CLOSE cur; CLOSE cur;


@ -1,5 +1,7 @@
DELIMITER $$ DELIMITER $$
CREATE OR REPLACE DEFINER=`vn`@`localhost` PROCEDURE `vn`.`ticket_doCmr`(vSelf INT) CREATE OR REPLACE DEFINER=`vn`@`localhost` PROCEDURE `vn`.`ticket_doCmr`(
vSelf INT
)
BEGIN BEGIN
/** /**
* Crea u actualiza la información del CMR asociado con * Crea u actualiza la información del CMR asociado con
@ -19,6 +21,7 @@ BEGIN
a.id addressFk, a.id addressFk,
c2.defaultAddressFk, c2.defaultAddressFk,
IFNULL(sat.supplierFk, su.id) supplierFk, IFNULL(sat.supplierFk, su.id) supplierFk,
t.packages,
t.landed t.landed
FROM ticket t FROM ticket t
JOIN client c ON c.id = t.clientFk JOIN client c ON c.id = t.clientFk
@ -52,9 +55,10 @@ BEGIN
c.addressToFk = t.addressFk, c.addressToFk = t.addressFk,
c.addressFromFk = t.defaultAddressFk, c.addressFromFk = t.defaultAddressFk,
c.supplierFk = t.supplierFk, c.supplierFk = t.supplierFk,
c.packagesList = t.packages,
c.ead = t.landed c.ead = t.landed
WHERE id = vCmrFk; WHERE id = vCmrFk;
ELSE ELSE
INSERT INTO cmr ( INSERT INTO cmr (
senderInstruccions, senderInstruccions,
truckPlate, truckPlate,
@ -62,6 +66,7 @@ BEGIN
addressToFk, addressToFk,
addressFromFk, addressFromFk,
supplierFk, supplierFk,
packagesList,
ead ead
) )
SELECT * FROM tTicket; SELECT * FROM tTicket;


@ -3,21 +3,25 @@ CREATE OR REPLACE DEFINER=`vn`@`localhost` PROCEDURE `vn`.`ticket_setState`(
vSelf INT, vSelf INT,
vStateCode VARCHAR(255) COLLATE utf8_general_ci vStateCode VARCHAR(255) COLLATE utf8_general_ci
) )
BEGIN proc:BEGIN
/** /**
* Modifica el estado de un ticket si se cumplen las condiciones necesarias. * Modifica el estado de un ticket si se cumplen las condiciones necesarias.
* *
* @param vSelf el id del ticket * @param vSelf el id del ticket
* @param vStateCode estado a modificar del ticket * @param vStateCode estado a modificar del ticket
*/ */
DECLARE vticketAlertLevel INT; DECLARE vTicketAlertLevel INT;
DECLARE vTicketStateCode VARCHAR(255); DECLARE vTicketStateCode VARCHAR(255) COLLATE utf8_general_ci;
DECLARE vCanChangeState BOOL; DECLARE vCanChangeState BOOL;
DECLARE vPackedAlertLevel INT; DECLARE vPackedAlertLevel INT;
DECLARE vZoneFk INT; DECLARE vZoneFk INT;
DECLARE vOldWorkerFk INT;
DECLARE vNewWorkerFk INT;
SELECT s.alertLevel, s.`code`, t.zoneFk SET vNewWorkerFk = account.myUser_getId();
INTO vticketAlertLevel, vTicketStateCode, vZoneFk
SELECT s.alertLevel, s.`code`, t.zoneFk, tt.userFk
INTO vTicketAlertLevel, vTicketStateCode, vZoneFk, vOldWorkerFk
FROM state s FROM state s
JOIN ticketTracking tt ON tt.stateFk = s.id JOIN ticketTracking tt ON tt.stateFk = s.id
JOIN ticket t ON t.id = tt.ticketFk JOIN ticket t ON t.id = tt.ticketFk
@ -33,24 +37,27 @@ BEGIN
SET vCanChangeState = (( SET vCanChangeState = ((
vStateCode <> 'ON_CHECKING' AND vStateCode <> 'CHECKED') OR vStateCode <> 'ON_CHECKING' AND vStateCode <> 'CHECKED') OR
vticketAlertLevel < vPackedAlertLevel vTicketAlertLevel < vPackedAlertLevel
)AND NOT ( ) AND NOT (
vTicketStateCode IN ('CHECKED', 'CHECKING') vTicketStateCode IN ('CHECKED', 'CHECKING')
AND vStateCode IN ('PREPARED', 'ON_PREPARATION') AND vStateCode IN ('PREPARED', 'ON_PREPARATION')
); );
IF vCanChangeState THEN IF vCanChangeState THEN
INSERT INTO ticketTracking (stateFk, ticketFk, userFk)
SELECT id, vSelf, account.myUser_getId()
FROM state
WHERE `code` = vStateCode COLLATE utf8_unicode_ci;
IF vStateCode = 'PACKED' THEN IF vStateCode = 'PACKED' THEN
CALL ticket_doCmr(vSelf); CALL ticket_doCmr(vSelf);
END IF; END IF;
IF vStateCode = vTicketStateCode AND vOldWorkerFk = vNewWorkerFk THEN
LEAVE proc;
END IF;
INSERT INTO ticketTracking (stateFk, ticketFk, userFk)
SELECT id, vSelf, vNewWorkerFk
FROM state
WHERE `code` = vStateCode COLLATE utf8_unicode_ci;
ELSE ELSE
CALL util.throw('INCORRECT_TICKET_STATE'); CALL util.throw('INCORRECT_TICKET_STATE');
END IF; END IF;
END$$ END$$
DELIMITER ; DELIMITER ;
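
-- A hedged usage sketch: with the early LEAVE above, repeating the current state with the same
-- worker is now a no-op instead of inserting a duplicate tracking row (ticket id 1 is illustrative).
CALL vn.ticket_setState(1, 'PACKED'); -- also runs ticket_doCmr for PACKED, per the branch above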


@ -1,26 +0,0 @@
DELIMITER $$
CREATE OR REPLACE DEFINER=`vn`@`localhost` TRIGGER `vn`.`roadmapStop_beforeDelete`
BEFORE DELETE ON `roadmapStop`
FOR EACH ROW
BEGIN
DECLARE vMaxEta DATETIME;
DECLARE vRoadmapEta DATETIME;
IF OLD.roadmapFk IS NOT NULL THEN
SELECT MAX(eta) INTO vMaxEta
FROM roadmapStop
WHERE roadmapFk = OLD.roadmapFk
AND id <> OLD.id;
SELECT eta INTO vRoadmapEta
FROM roadmap
WHERE id = OLD.roadmapFk;
IF vMaxEta <> vRoadmapEta OR vMaxEta IS NULL THEN
UPDATE roadmap
SET eta = vMaxEta
WHERE id = OLD.roadmapFk;
END IF;
END IF;
END$$
DELIMITER ;


@ -3,8 +3,6 @@ CREATE OR REPLACE DEFINER=`vn`@`localhost` TRIGGER `vn`.`roadmapStop_beforeInser
BEFORE INSERT ON `roadmapStop` BEFORE INSERT ON `roadmapStop`
FOR EACH ROW FOR EACH ROW
BEGIN BEGIN
DECLARE vRoadmapEta DATETIME;
SET NEW.editorFk = account.myUser_getId(); SET NEW.editorFk = account.myUser_getId();
IF NEW.description IS NOT NULL THEN IF NEW.description IS NOT NULL THEN
@ -16,17 +14,5 @@ BEGIN
CALL util.throw('Departure time can not be after arrival time'); CALL util.throw('Departure time can not be after arrival time');
END IF; END IF;
END IF; END IF;
IF NEW.roadmapFk IS NOT NULL AND NEW.eta IS NOT NULL THEN
SELECT eta INTO vRoadmapEta
FROM roadmap
WHERE id = NEW.roadmapFk;
IF vRoadmapEta < NEW.eta OR vRoadmapEta IS NULL THEN
UPDATE roadmap
SET eta = NEW.eta
WHERE id = NEW.roadmapFk;
END IF;
END IF;
END$$ END$$
DELIMITER ; DELIMITER ;


@ -3,40 +3,17 @@ CREATE OR REPLACE DEFINER=`vn`@`localhost` TRIGGER `vn`.`roadmapStop_beforeUpdat
BEFORE UPDATE ON `roadmapStop` BEFORE UPDATE ON `roadmapStop`
FOR EACH ROW FOR EACH ROW
BEGIN BEGIN
DECLARE vMaxEta DATETIME;
DECLARE vCurrentEta DATETIME;
SET NEW.editorFk = account.myUser_getId(); SET NEW.editorFk = account.myUser_getId();
IF NOT (NEW.description <=> OLD.description) THEN IF NOT (NEW.description <=> OLD.description) THEN
SET NEW.description = UCASE(NEW.description); SET NEW.description = UCASE(NEW.description);
END IF; END IF;
IF (NOT (NEW.roadmapFk <=> OLD.roadmapFk) AND NEW.roadmapFk IS NOT NULL) IF NOT (NEW.roadmapFk <=> OLD.roadmapFk) OR NOT (NEW.eta <=> OLD.eta) THEN
OR (NOT (NEW.eta <=> OLD.eta)) THEN
IF NEW.eta < (SELECT etd FROM roadmap WHERE id = NEW.roadmapFk) THEN IF NEW.eta < (SELECT etd FROM roadmap WHERE id = NEW.roadmapFk) THEN
CALL util.throw('Departure time can not be after arrival time'); CALL util.throw('Departure time can not be after arrival time');
END IF; END IF;
SELECT MAX(eta) INTO vMaxEta
FROM roadmapStop
WHERE roadmapFk = NEW.roadmapFk
AND id <> OLD.id;
IF vMaxEta < NEW.eta OR vMaxEta IS NULL THEN
SET vMaxEta = NEW.eta;
END IF;
SELECT eta INTO vCurrentEta
FROM roadmap
WHERE id = NEW.roadmapFk;
IF (vMaxEta <> vCurrentEta OR vMaxEta IS NULL) OR vMaxEta IS NOT NULL THEN
UPDATE roadmap
SET eta = vMaxEta
WHERE id = NEW.roadmapFk;
END IF;
END IF; END IF;
END$$ END$$
DELIMITER ; DELIMITER ;


@ -1,17 +0,0 @@
DELIMITER $$
CREATE OR REPLACE DEFINER=`vn`@`localhost` TRIGGER `vn`.`roadmap_afterUpdate`
AFTER UPDATE ON `roadmap`
FOR EACH ROW
BEGIN
DECLARE vSeconds INT;
IF NOT (NEW.etd <=> OLD.etd) THEN
SET vSeconds = TIME_TO_SEC(TIMEDIFF(NEW.etd, OLD.etd));
IF vSeconds IS NOT NULL AND vSeconds <> 0 THEN
UPDATE roadmapStop
SET eta = eta + INTERVAL vSeconds SECOND
WHERE roadmapFk = NEW.id;
END IF;
END IF;
END$$
DELIMITER ;


@ -3,6 +3,8 @@ CREATE OR REPLACE DEFINER=`vn`@`localhost` TRIGGER `vn`.`roadmap_beforeUpdate`
BEFORE UPDATE ON `roadmap` BEFORE UPDATE ON `roadmap`
FOR EACH ROW FOR EACH ROW
BEGIN BEGIN
DECLARE vSeconds INT;
SET NEW.editorFk = account.myUser_getId(); SET NEW.editorFk = account.myUser_getId();
IF NOT (NEW.name <=> OLD.name) THEN IF NOT (NEW.name <=> OLD.name) THEN
@ -37,5 +39,15 @@ BEGIN
FROM worker w FROM worker w
WHERE w.id = NEW.driverChangeFk); WHERE w.id = NEW.driverChangeFk);
END IF; END IF;
IF NOT (NEW.etd <=> OLD.etd) THEN
SET vSeconds = TIME_TO_SEC(TIMEDIFF(NEW.etd, OLD.etd));
IF vSeconds <> 0 THEN
UPDATE roadmapStop
SET eta = eta + INTERVAL vSeconds SECOND
WHERE roadmapFk = NEW.id;
END IF;
END IF;
END$$ END$$
DELIMITER ; DELIMITER ;
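
-- The etd-shift logic removed from roadmap_afterUpdate now lives in this trigger, so every stop's
-- eta moves by the same offset as the roadmap's etd. A hedged sketch of the effect on fixture roadmap 1:
UPDATE vn.roadmap SET etd = etd + INTERVAL 2 HOUR WHERE id = 1; -- delays departure by 2 hours
SELECT id, roadmapFk, eta FROM vn.roadmapStop WHERE roadmapFk = 1; -- each eta shifted by 2 hours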


@ -0,0 +1,8 @@
CREATE OR REPLACE DEFINER=`vn`@`localhost`
SQL SECURITY DEFINER
VIEW `vn`.`roadmapEta`
AS SELECT `roadmapFk` AS id,
MAX(`eta`) AS `eta`
FROM `vn`.`roadmapStop`
WHERE `roadmapFk` IS NOT NULL
GROUP BY `roadmapFk`;
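
-- With roadmap.eta dropped further down, the arrival estimate is derived from the stops through this
-- view. A hedged lookup sketch:
SELECT r.id, r.name, r.etd, re.eta
    FROM vn.roadmap r
    LEFT JOIN vn.roadmapEta re ON re.id = r.id; -- eta = MAX eta of the roadmap's stops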


@ -0,0 +1,6 @@
INSERT IGNORE INTO salix.ACL (model,property,accessType,permission,principalType,principalId)
VALUES
('Ticket','itemLack','READ','ALLOW','ROLE','employee'),
('Ticket','itemLackDetail','READ','ALLOW','ROLE','employee'),
('Ticket','split','WRITE','ALLOW','ROLE','employee'),
('Sale','replaceItem','WRITE','ALLOW','ROLE','employee');


@ -0,0 +1,2 @@
ALTER TABLE vn.ticketConfig ADD lackAlertPrice int(11) DEFAULT 30 NOT NULL COMMENT 'Value to alert when item proposal exceed price';
ALTER TABLE vn.ticketConfig ADD lackScopeDays int(11) DEFAULT 2 NOT NULL COMMENT 'Number of days to look back for ticket with negatives';


@ -0,0 +1,90 @@
INSERT INTO salix.ACL (model,property,accessType,permission,principalType,principalId)
VALUES ('Entry','getBuyList','READ','ALLOW','ROLE','buyer'),
('Entry','getBuyUltimate','READ','ALLOW','ROLE','buyer'),
('Entry','search','READ','ALLOW','ROLE','buyer'),
('Entry','create','WRITE','ALLOW','ROLE','buyer'),
('Entry','cloneEntry','WRITE','ALLOW','ROLE','buyer'),
('Entry','deleteEntry','WRITE','ALLOW','ROLE','buyer'),
('Entry','recalcEntryPrices','WRITE','ALLOW','ROLE','buyer'),
('EntryType','find','READ','ALLOW','ROLE','buyer'),
('EntryConfig','findOne','READ','ALLOW','ROLE','buyer');
ALTER TABLE vn.ink ADD IF NOT EXISTS hexJson TEXT NOT NULL;
UPDATE vn.ink
SET hexJson = CONCAT('{"value": ["',hex,'"]}');
UPDATE vn.ink
SET hexJson = CASE `name`
WHEN 'Blanco/Naranja' THEN '{"value": ["FFFFFF", "FFA500"]}'
WHEN 'Sin especificar' THEN '{"value": ["808080"]}'
WHEN '2 Colores' THEN '{"value": ["000000", "FFFFFF"]}'
WHEN 'Amarillo/Marrón' THEN '{"value": ["FFFF00", "8B4513"]}'
WHEN 'Amarillo/Naranja' THEN '{"value": ["FFFF00", "FFA500"]}'
WHEN 'Rosa/Blanco/Amarillo' THEN '{"value": ["FFC0CB", "FFFFFF", "FFFF00"]}'
WHEN 'Rosa/Amarillo' THEN '{"value": ["FFC0CB", "FFFF00"]}'
WHEN 'Antracita' THEN '{"value": ["2F2F2F"]}'
WHEN 'Azul/Amarillo' THEN '{"value": ["0000FF", "FFFF00"]}'
WHEN 'Azul Claro' THEN '{"value": ["ADD8E6"]}'
WHEN 'Azul/Marron' THEN '{"value": ["0000FF", "8B4513"]}'
WHEN 'Azul/Verde' THEN '{"value": ["0000FF", "008000"]}'
WHEN 'Blanco/Amarillo' THEN '{"value": ["FFFFFF", "FFFF00"]}'
WHEN 'Blaugrana' THEN '{"value": ["A50044", "004D98"]}'
WHEN 'Blanco/Negro' THEN '{"value": ["FFFFFF", "000000"]}'
WHEN 'Blanco/Verde' THEN '{"value": ["FFFFFF", "008000"]}'
WHEN 'Blanco/Azul' THEN '{"value": ["FFFFFF", "0000FF"]}'
WHEN 'Blanco/Rosa' THEN '{"value": ["FFFFFF", "FFC0CB"]}'
WHEN 'Cognac/Verde' THEN '{"value": ["9A463D", "008000"]}'
WHEN 'Champagne/Verde' THEN '{"value": ["F7E7CE", "008000"]}'
WHEN 'Camuflaje' THEN '{"value": ["6B8E23", "556B2F", "8B4513"]}'
WHEN 'Crema/Rosa' THEN '{"value": ["FFFDD0", "FFC0CB"]}'
WHEN 'Fucsia/Amarillo' THEN '{"value": ["FF00FF", "FFFF00"]}'
WHEN 'Fucsia/Blanco' THEN '{"value": ["FF00FF", "FFFFFF"]}'
WHEN 'Fucsia/Crema' THEN '{"value": ["FF00FF", "FFFDD0"]}'
WHEN 'Fucsia/Rosa' THEN '{"value": ["FF00FF", "FFC0CB"]}'
WHEN 'Fucsia/Verde' THEN '{"value": ["FF00FF", "008000"]}'
WHEN 'Granate/Blanco' THEN '{"value": ["800000", "FFFFFF"]}'
WHEN 'Gris Lila' THEN '{"value": ["808080", "C8A2C8"]}'
WHEN 'Lavanda/Amarillo' THEN '{"value": ["E6E6FA", "FFFF00"]}'
WHEN 'Lavanda/Gris' THEN '{"value": ["E6E6FA", "808080"]}'
WHEN 'Lividum' THEN '{"value": ["702963"]}'
WHEN 'Morado/Amarillo' THEN '{"value": ["800080", "FFFF00"]}'
WHEN 'Marrón/Blanco' THEN '{"value": ["8B4513", "FFFFFF"]}'
WHEN 'Marron/Gris' THEN '{"value": ["8B4513", "808080"]}'
WHEN 'Marron/Negro' THEN '{"value": ["8B4513", "000000"]}'
WHEN 'Marrón/Verde' THEN '{"value": ["8B4513", "008000"]}'
WHEN 'Matizado' THEN '{"value": ["D3D3D3", "808080", "FFFFFF"]}'
WHEN 'Mixto' THEN '{"value": ["FF0000", "0000FF", "008000", "FFFF00"]}'
WHEN 'Marrón Oscuro' THEN '{"value": ["654321"]}'
WHEN 'Naranja/Marron' THEN '{"value": ["FFA500", "8B4513"]}'
WHEN 'Naranja/Rosa' THEN '{"value": ["FFA500", "FFC0CB"]}'
WHEN 'Ocre/Burgundi' THEN '{"value": ["CC7722", "800020"]}'
WHEN 'Oro/Plata' THEN '{"value": ["FFD700", "C0C0C0"]}'
WHEN 'Oro/Negro' THEN '{"value": ["FFD700", "000000"]}'
WHEN 'Oro/Verde' THEN '{"value": ["FFD700", "008000"]}'
WHEN 'Purpura/Blanco' THEN '{"value": ["800080", "FFFFFF"]}'
WHEN 'Purpura/Rosa' THEN '{"value": ["800080", "FFC0CB"]}'
WHEN 'Pastel' THEN '{"value": ["FFB6C1", "87CEFA", "98FB98"]}'
WHEN 'Plata' THEN '{"value": ["C0C0C0"]}'
WHEN 'Plata/Verde' THEN '{"value": ["C0C0C0", "008000"]}'
WHEN 'Rojo/Amarillo' THEN '{"value": ["FF0000", "FFFF00"]}'
WHEN 'Rojo/Blanco' THEN '{"value": ["FF0000", "FFFFFF"]}'
WHEN 'Rojo/Naranja' THEN '{"value": ["FF0000", "FFA500"]}'
WHEN 'Rojo/Oro' THEN '{"value": ["FF0000", "FFD700"]}'
WHEN 'Rojo/Verde' THEN '{"value": ["FF0000", "008000"]}'
WHEN 'Rosa/Lila' THEN '{"value": ["FFC0CB", "C8A2C8"]}'
WHEN 'Rosa/Naranja' THEN '{"value": ["FFC0CB", "FFA500"]}'
WHEN 'Rojo/Rosa' THEN '{"value": ["FF0000", "FFC0CB"]}'
WHEN 'Rosa empolvado' THEN '{"value": ["E6B8AF"]}'
WHEN 'Rosa/Verde' THEN '{"value": ["FFC0CB", "008000"]}'
WHEN 'Topo/Blanco' THEN '{"value": ["8B8589", "FFFFFF"]}'
WHEN 'Topo' THEN '{"value": ["8B8589"]}'
WHEN 'Transparente' THEN '{"value": ["00000000"]}'
WHEN 'Verde/Amarillo' THEN '{"value": ["008000", "FFFF00"]}'
WHEN 'Verde/Negro' THEN '{"value": ["008000", "000000"]}'
WHEN 'Variado' THEN '{"value": ["FF0000", "0000FF", "008000", "FFFF00", "FFA500"]}'
WHEN 'Verde Claro/Morado' THEN '{"value": ["90EE90", "800080"]}'
WHEN 'Verde/Lila' THEN '{"value": ["008000", "C8A2C8"]}'
WHEN 'Vaquero Neon' THEN '{"value": ["1560BD", "FFFF00"]}'
ELSE hexJson
END;
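
-- hexJson stores the swatch list as a JSON document, so multi-colour inks can carry several hex
-- codes. A hedged read sketch using standard JSON functions:
SELECT id, `name`, hex, JSON_UNQUOTE(JSON_EXTRACT(hexJson, '$.value[0]')) firstHex
    FROM vn.ink;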


@ -0,0 +1,3 @@
ALTER TABLE vn.roadmap
MODIFY COLUMN dollyPlate varchar(10) CHARACTER SET utf8mb3 COLLATE utf8mb3_unicode_ci DEFAULT NULL NULL COMMENT
'Vehículo sin motor diseñado para conectarse a una unidad tractora, un camión o un vehículo tractor con fuerte potencia de tracción';


@ -0,0 +1,3 @@
ALTER TABLE vn.volumeConfig ADD COLUMN id INT(11) NOT NULL AUTO_INCREMENT PRIMARY KEY FIRST;
GRANT UPDATE (palletM3) ON vn.volumeConfig TO deliveryBoss;


@ -0,0 +1,3 @@
ALTER TABLE vn.vehicle
MODIFY COLUMN typeFk enum('car','van','truck','trailer','tug','dolly','trailerLink')
CHARACTER SET utf8mb3 COLLATE utf8mb3_unicode_ci DEFAULT 'van' NOT NULL;


@ -0,0 +1 @@
ALTER TABLE vn.roadmap DROP COLUMN eta;


@ -0,0 +1,8 @@
UPDATE vn.expedition e
JOIN (
SELECT id
FROM vn.expedition
WHERE hostFk COLLATE utf8mb3_unicode_ci NOT IN
(SELECT code COLLATE utf8mb3_unicode_ci FROM vn.host WHERE code IS NOT NULL)
) s ON e.id = s.id
SET e.hostFk = 'pc336';
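
-- Orphan hostFk values are remapped to 'pc336' before the foreign key is added in the next
-- migration. A hedged verification sketch that should return no rows once this update has run:
SELECT e.id, e.hostFk
    FROM vn.expedition e
    LEFT JOIN vn.host h ON h.code COLLATE utf8mb3_unicode_ci = e.hostFk COLLATE utf8mb3_unicode_ci
    WHERE e.hostFk IS NOT NULL
        AND h.code IS NULL;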


@ -0,0 +1,9 @@
ALTER TABLE vn.expedition
MODIFY COLUMN hostFk VARCHAR(30) COLLATE utf8mb3_general_ci;
ALTER TABLE vn.expedition
ADD CONSTRAINT fk_expedition_host_code
FOREIGN KEY (hostFk)
REFERENCES host(code)
ON UPDATE CASCADE
ON DELETE CASCADE;


@ -234,6 +234,7 @@
"It has been invoiced but the PDF of refund not be generated": "It has been invoiced but the PDF of refund not be generated", "It has been invoiced but the PDF of refund not be generated": "It has been invoiced but the PDF of refund not be generated",
"Cannot add holidays on this day": "Cannot add holidays on this day", "Cannot add holidays on this day": "Cannot add holidays on this day",
"Cannot send mail": "Cannot send mail", "Cannot send mail": "Cannot send mail",
"This worker already exists": "This worker already exists",
"CONSTRAINT `chkParkingCodeFormat` failed for `vn`.`parking`": "CONSTRAINT `chkParkingCodeFormat` failed for `vn`.`parking`", "CONSTRAINT `chkParkingCodeFormat` failed for `vn`.`parking`": "CONSTRAINT `chkParkingCodeFormat` failed for `vn`.`parking`",
"This postcode already exists": "This postcode already exists", "This postcode already exists": "This postcode already exists",
"Original invoice not found": "Original invoice not found", "Original invoice not found": "Original invoice not found",
@ -254,5 +255,7 @@
"Holidays to past days not available": "Holidays to past days not available", "Holidays to past days not available": "Holidays to past days not available",
"Incorrect delivery order alert on route": "Incorrect delivery order alert on route: {{ route }} zone: {{ zone }}", "Incorrect delivery order alert on route": "Incorrect delivery order alert on route: {{ route }} zone: {{ zone }}",
"Ticket has been delivered out of order": "The ticket {{ticket}} of route {{{fullUrl}}} has been delivered out of order.", "Ticket has been delivered out of order": "The ticket {{ticket}} of route {{{fullUrl}}} has been delivered out of order.",
"clonedFromTicketWeekly": ", that is a cloned sale from ticket {{ ticketWeekly }}" "clonedFromTicketWeekly": ", that is a cloned sale from ticket {{ ticketWeekly }}",
"negativeReplaced": "Replaced item [#{{oldItemId}}]({{{oldItemUrl}}}) {{oldItem}} with [#{{newItemId}}]({{{newItemUrl}}}) {{newItem}} from ticket [{{ticketId}}]({{{ticketUrl}}})",
"The tag and priority can't be repeated": "The tag and priority can't be repeated"
} }

View File

@ -22,7 +22,7 @@
"Cannot change the payment method if no salesperson": "No se puede cambiar la forma de pago si no hay comercial asignado", "Cannot change the payment method if no salesperson": "No se puede cambiar la forma de pago si no hay comercial asignado",
"can't be blank": "El campo no puede estar vacío", "can't be blank": "El campo no puede estar vacío",
"Observation type must be unique": "El tipo de observación no puede repetirse", "Observation type must be unique": "El tipo de observación no puede repetirse",
"The credit must be an integer greater than or equal to zero": "The credit must be an integer greater than or equal to zero", "The credit must be an integer greater than or equal to zero": "The credit must be an integer greater than or equal to zero",
"The grade must be similar to the last one": "El grade debe ser similar al último", "The grade must be similar to the last one": "El grade debe ser similar al último",
"Only manager can change the credit": "Solo el gerente puede cambiar el credito de este cliente", "Only manager can change the credit": "Solo el gerente puede cambiar el credito de este cliente",
"Name cannot be blank": "El nombre no puede estar en blanco", "Name cannot be blank": "El nombre no puede estar en blanco",
@ -397,5 +397,6 @@
"Incorrect delivery order alert on route": "Alerta de orden de entrega incorrecta en ruta: {{ route }} zona: {{ zone }}", "Incorrect delivery order alert on route": "Alerta de orden de entrega incorrecta en ruta: {{ route }} zona: {{ zone }}",
"Ticket has been delivered out of order": "El ticket {{ticket}} {{{fullUrl}}} no ha sido entregado en su orden.", "Ticket has been delivered out of order": "El ticket {{ticket}} {{{fullUrl}}} no ha sido entregado en su orden.",
"Price cannot be blank": "El precio no puede estar en blanco", "Price cannot be blank": "El precio no puede estar en blanco",
"clonedFromTicketWeekly": ", que es una linea clonada del ticket {{ticketWeekly}}" "clonedFromTicketWeekly": ", que es una linea clonada del ticket {{ticketWeekly}}",
"negativeReplaced": "Sustituido el articulo [#{{oldItemId}}]({{{oldItemUrl}}}) {{oldItem}} por [#{{newItemId}}]({{{newItemUrl}}}) {{newItem}} del ticket [{{ticketId}}]({{{ticketUrl}}})"
} }

View File

@ -368,5 +368,6 @@
"ticketLostExpedition": "Le ticket [{{ticketId}}]({{{ticketUrl}}}) a l'expédition perdue suivante : {{expeditionId}}", "ticketLostExpedition": "Le ticket [{{ticketId}}]({{{ticketUrl}}}) a l'expédition perdue suivante : {{expeditionId}}",
"The web user's email already exists": "L'email de l'internaute existe déjà", "The web user's email already exists": "L'email de l'internaute existe déjà",
"Incorrect delivery order alert on route": "Alerte de bon de livraison incorrect sur l'itinéraire: {{ route }} zone : {{ zone }}", "Incorrect delivery order alert on route": "Alerte de bon de livraison incorrect sur l'itinéraire: {{ route }} zone : {{ zone }}",
"Ticket has been delivered out of order": "Le ticket {{ticket}} de la route {{{fullUrl}}} a été livré hors service." "Ticket has been delivered out of order": "Le ticket {{ticket}} de la route {{{fullUrl}}} a été livré hors service.",
} "negativeReplaced": "Remplacé l'article [#{{oldItemId}}]({{{oldItemUrl}}}) {{oldItem}} par [#{{newItemId}}]({{{newItemUrl}}}) {{newItem}} du ticket [{{ticketId}}]({{{ticketUrl}}})"
}

View File

@ -367,5 +367,6 @@
"ticketLostExpedition": "O ticket [{{ticketId}}]({{{ticketUrl}}}) tem a seguinte expedição perdida: {{expeditionId}}", "ticketLostExpedition": "O ticket [{{ticketId}}]({{{ticketUrl}}}) tem a seguinte expedição perdida: {{expeditionId}}",
"The web user's email already exists": "O e-mail do utilizador da web já existe.", "The web user's email already exists": "O e-mail do utilizador da web já existe.",
"Incorrect delivery order alert on route": "Alerta de ordem de entrega incorreta na rota: {{ route }} zona: {{ zone }}", "Incorrect delivery order alert on route": "Alerta de ordem de entrega incorreta na rota: {{ route }} zona: {{ zone }}",
"Ticket has been delivered out of order": "O ticket {{ticket}} da rota {{{fullUrl}}} foi entregue fora de ordem." "Ticket has been delivered out of order": "O ticket {{ticket}} da rota {{{fullUrl}}} foi entregue fora de ordem.",
} "negativeReplaced": "Substituído o artigo [#{{oldItemId}}]({{{oldItemUrl}}}) {{oldItem}} por [#{{newItemId}}]({{{newItemUrl}}}) {{newItem}} do ticket [{{ticketId}}]({{{ticketUrl}}})"
}

View File

@ -0,0 +1,303 @@
const ParameterizedSQL = require('loopback-connector').ParameterizedSQL;
const buildFilter = require('vn-loopback/util/filter').buildFilter;
const mergeFilters = require('vn-loopback/util/filter').mergeFilters;
module.exports = Self => {
Self.remoteMethodCtx('getBuyList', {
description: 'Returns the buys of an entry for editing',
accessType: 'READ',
accepts: [{
arg: 'entryFk',
type: 'number',
required: true,
description: 'The entry id',
http: {source: 'path'}
},
{
arg: 'filter',
type: 'object',
description: 'Filter defining where, order, offset, and limit - must be a JSON-encoded string'
},
{
arg: 'isIgnored',
type: 'boolean',
description: 'check if the buy is ignored',
http: {source: 'query'}
},
{
arg: 'itemFk',
type: 'number',
description: 'item id',
http: {source: 'query'}
},
{
arg: 'name',
type: 'string',
description: 'item name',
http: {source: 'query'}
},
{
arg: 'size',
type: 'number',
description: 'item size',
http: {source: 'query'}
},
{
arg: 'stickers',
type: 'number',
description: 'sticker quantity',
http: {source: 'query'}
},
{
arg: 'packagingFk',
type: 'number',
description: 'packaging id',
http: {source: 'query'}
},
{
arg: 'weight',
type: 'number',
description: 'weight',
http: {source: 'query'}
},
{
arg: 'packing',
type: 'number',
description: 'packing quantity',
http: {source: 'query'}
},
{
arg: 'grouping',
type: 'number',
description: 'grouping quantity',
http: {source: 'query'}
},
{
arg: 'quantity',
type: 'number',
http: {source: 'query'}
},
{
arg: 'buyingValue',
type: 'number',
http: {source: 'query'}
},
{
arg: 'amount',
type: 'number',
description: 'buying value * quantity',
http: {source: 'query'}
},
{
arg: 'price2',
type: 'number',
description: 'price for the package',
http: {source: 'query'}
},
{
arg: 'price3',
type: 'number',
description: 'price for the box',
http: {source: 'query'}
},
{
arg: 'minPrice',
type: 'number',
description: 'item minimum price',
http: {source: 'query'}
},
{
arg: 'packingOut',
type: 'number',
description: 'quantity of package on a vn box',
http: {source: 'query'}
},
{
arg: 'comment',
type: 'string',
description: 'item comment',
http: {source: 'query'}
},
{
arg: 'subName',
type: 'string',
description: 'supplier name',
http: {source: 'query'}
},
{
arg: 'company_name',
type: 'string',
description: 'company name',
http: {source: 'query'}
},
{
arg: 'workerFk',
type: 'number',
description: 'buyer id',
http: {source: 'query'}
},
{
arg: 'itemTypeFk',
type: 'number',
description: 'item family id',
http: {source: 'query'}
},
{
arg: 'groupingMode',
type: 'string',
description: 'grouping mode',
http: {source: 'query'}
},
{
arg: 'hasMinPrice',
type: 'boolean',
description: 'whether the item has a minimum price',
http: {source: 'query'}
},
{
arg: 'groupBy',
type: 'string',
description: 'group by',
http: {source: 'query'}
},
],
returns: {
type: ['object'],
root: true
},
http: {
path: `/:entryFk/getBuyList`,
verb: 'GET'
}
});
Self.getBuyList = async(ctx, entryFk, filter, options) => {
const myOptions = {};
if (typeof options == 'object')
Object.assign(myOptions, options);
let conn = Self.dataSource.connector;
let where = buildFilter(ctx.args, (param, value) => {
switch (param) {
case 'name':
case 'subName':
case 'company_name':
case 'comment':
return {[param]: {like: `%${value}%`}};
case 'size':
case 'isIgnored':
case 'itemFk':
case 'stickers':
case 'packagingFk':
case 'weight':
case 'packing':
case 'grouping':
case 'quantity':
case 'buyingValue':
case 'amount':
case 'price2':
case 'price3':
case 'packingOut':
case 'minPrice':
case 'workerFk':
case 'itemTypeFk':
case 'groupingMode':
case 'hasMinPrice':
return {[param]: value};
}
});
filter = mergeFilters(filter, {where});
let stmts = [];
let stmt;
const selectFields = `b.id,
b.isIgnored,
b.itemFk,
b.printedStickers,
b.stickers,
b.packagingFk,
b.weight,
b.packing,
b.groupingMode,
b.grouping,
b.quantity,
b.buyingValue,
ROUND(b.buyingValue * b.quantity, 2) amount,
b.isChecked,
b.price2,
b.price3,
i.name,
i.size,
i.minPrice,
i.hasMinPrice,
i.packingOut,
i.comment,
i.subName,
i.tag5,
i.value5,
i.tag6,
i.value6,
i.tag7,
i.value7,
i.tag8,
i.value8,
i.tag9,
i.value9,
i.tag10,
i.value10,
s.company_name,
ik.hexJson,
it.workerFk,
it.id itemTypeFk
`;
const groupByFields = `SUM(b.printedStickers) printedStickers,
SUM(b.packing) packing,
SUM(b.stickers) stickers,
SUM(b.weight) weight,
SUM(b.quantity) quantity,
SUM(ROUND(b.buyingValue * b.quantity, 2)) amount
`;
const groupBy = ctx.args.groupBy;
stmt = new ParameterizedSQL(
`SELECT *
FROM(
SELECT
${ groupBy ? groupByFields : selectFields}
FROM item i
LEFT JOIN ink ik ON ik.id = i.inkFk
LEFT JOIN buy b ON b.itemFk = i.id
LEFT JOIN edi.ekt e ON e.id = b.ektFk
LEFT JOIN edi.supplier s ON e.pro = s.supplier_id
LEFT JOIN itemType it ON it.id = i.typeFk
WHERE b.entryFk = ?
${groupBy ?? ''}
) sub`,
[entryFk]
);
stmt.merge(conn.makeSuffix(filter));
let itemsIndex = stmts.push(stmt) - 1;
let sql = ParameterizedSQL.join(stmts, ';');
let result = await conn.executeStmt(sql, myOptions);
if (groupBy && result.length) {
const buys = await Self.app.models.Buy.find({where: {entryFk}}, myOptions);
const buysChecked = buys.filter(buy => buy?.isChecked);
result[0].isChecked = buysChecked.length === buys.length;
}
return itemsIndex === 0 ? result : result[itemsIndex];
};
};
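
A hedged usage sketch for the new getBuyList method, mirroring how other specs in this PR call remote methods directly on the models; the entry id and filter values are placeholders, not fixture data:

const {models} = require('vn-loopback/server/server');

async function listEntryBuys(entryFk) {
    // ctx.args carries the optional column filters the method switches on
    const ctx = {args: {entryFk, isIgnored: false}, req: {accessToken: {userId: 9}}};
    const filter = {order: 'name', limit: 20};

    // Without groupBy the per-buy rows are returned; with ctx.args.groupBy set,
    // the aggregated (SUM) columns are returned instead.
    return models.Entry.getBuyList(ctx, entryFk, filter);
}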

View File

@ -0,0 +1,46 @@
module.exports = Self => {
Self.remoteMethodCtx('getBuyUltimate', {
description: 'Returns the last buy of the item',
accessType: 'READ',
accepts: [
{
arg: 'itemFk',
type: 'number',
required: true
}, {
arg: 'warehouseFk',
type: 'number',
required: true
}, {
arg: 'date',
type: 'date',
required: true
}
],
returns: {
type: 'object',
root: true
},
http: {
path: `/getBuyUltimate`,
verb: 'GET'
}
});
Self.getBuyUltimate = async(ctx, itemFk, warehouseFk, date, options) => {
const myOptions = {};
if (typeof options == 'object')
Object.assign(myOptions, options);
await Self.rawSql('CALL vn.buy_getUltimate(?, ?, ?)', [itemFk, warehouseFk, date], myOptions);
return Self.rawSql(
`SELECT b.*
FROM cache.last_buy lb
JOIN buy b ON b.id = lb.buy_id
WHERE lb.item_id = ?
ORDER BY (lb.warehouse_id = ?) desc
LIMIT 1`,
[itemFk, warehouseFk], myOptions
);
};
};
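
getBuyUltimate first refreshes cache.last_buy through vn.buy_getUltimate and then reads from it. A usage sketch under the same assumptions as above (placeholder ids, Date.vnNew() being the project's date helper):

const {models} = require('vn-loopback/server/server');

async function lastBuyOf(itemFk, warehouseFk) {
    const ctx = {req: {accessToken: {userId: 9}}};
    // Returns at most one buy row, preferring the requested warehouse
    const [buy] = await models.Entry.getBuyUltimate(ctx, itemFk, warehouseFk, Date.vnNew());
    return buy;
}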

View File

@ -0,0 +1,46 @@
module.exports = Self => {
Self.remoteMethodCtx('cloneEntry', {
description: 'Clones an entry',
accessType: 'WRITE',
accepts: [{
arg: 'id',
type: 'number',
required: true,
description: 'The entry id',
http: {source: 'path'}
}],
returns: {
type: 'object',
root: true
},
http: {
path: `/:id/cloneEntry`,
verb: 'POST'
}
});
Self.cloneEntry = async(ctx, id, options) => {
const userId = ctx.req.accessToken.userId;
const myOptions = {userId};
let tx;
if (typeof options == 'object')
Object.assign(myOptions, options);
if (!myOptions.transaction) {
tx = await Self.beginTransaction({});
myOptions.transaction = tx;
}
try {
await Self.rawSql('CALL entry_clone(?, @newEntryId)', [id], myOptions);
const result = await Self.rawSql('SELECT @newEntryId', [], myOptions);
const newEntryId = result[0]['@newEntryId'];
if (tx) await tx.commit();
return newEntryId;
} catch (e) {
if (tx) await tx.rollback();
throw e;
}
};
};
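
cloneEntry wraps vn.entry_clone in a transaction and returns the new entry id. A hedged sketch of calling it inside an outer transaction so the clone can be discarded, as the specs in this PR do:

const {models} = require('vn-loopback/server/server');

async function cloneAndDiscard(entryId) {
    const tx = await models.Entry.beginTransaction({});
    const options = {transaction: tx};
    const ctx = {req: {accessToken: {userId: 9}}};
    try {
        const newEntryId = await models.Entry.cloneEntry(ctx, entryId, options);
        await tx.rollback(); // discard the clone in this sketch
        return newEntryId;
    } catch (e) {
        await tx.rollback();
        throw e;
    }
}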

View File

@ -0,0 +1,48 @@
module.exports = Self => {
Self.remoteMethodCtx('deleteEntry', {
description: 'Deletes an entry',
accessType: 'WRITE',
accepts: [{
arg: 'id',
type: 'number',
required: true,
description: 'The entry id',
http: {source: 'path'}
}],
http: {
path: `/:id/deleteEntry`,
verb: 'POST'
}
});
Self.deleteEntry = async(ctx, id, options) => {
const userId = ctx.req.accessToken.userId;
const myOptions = {userId};
let tx;
if (typeof options == 'object')
Object.assign(myOptions, options);
if (!myOptions.transaction) {
tx = await Self.beginTransaction({});
myOptions.transaction = tx;
}
try {
const entry = await Self.findById(id, null, myOptions);
await entry.updateAttribute('travelFk', null, myOptions);
await Self.rawSql('DELETE FROM vn.duaEntry WHERE entryFk = ?;', [id], myOptions);
await Self.rawSql(`
DELETE i.*
FROM vn.invoiceIn i
JOIN vn.entry e ON e.invoiceInFk = i.id
WHERE e.id = ?`, [id], myOptions
);
if (tx) await tx.commit();
} catch (e) {
if (tx) await tx.rollback();
throw e;
}
};
};
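
Despite its name, deleteEntry does not remove the entry row itself: it detaches the travel and deletes the duaEntry links and the linked invoiceIn. A hedged sketch checking that behaviour with a placeholder id:

const {models} = require('vn-loopback/server/server');

async function detachEntry(entryId) {
    const ctx = {req: {accessToken: {userId: 9}}};
    const tx = await models.Entry.beginTransaction({});
    const options = {transaction: tx};
    try {
        await models.Entry.deleteEntry(ctx, entryId, options);
        const entry = await models.Entry.findById(entryId, null, options);
        return entry.travelFk; // null after the cleanup
    } finally {
        await tx.rollback();
    }
}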

View File

@ -129,7 +129,68 @@ module.exports = Self => {
arg: 'finalTemperature',
type: 'number',
description: 'Final temperature value'
-}
+},
+{
+arg: 'isExcludedFromAvailable',
+type: 'boolean',
+description: `whether the entry is excluded from available`
+},
+{
+arg: 'isReceived',
+type: 'boolean',
+description: `travel received`
+},
+{
+arg: 'isRaid',
+type: 'boolean',
+description: `travel isRaid`
+},
+{
+arg: 'landed',
+type: 'date',
+description: `landing date`
+},
+{
+arg: 'invoiceNumber',
+type: 'string',
+description: `entry invoice`
+},
+{
+arg: 'reference',
+type: 'string',
+description: `entry reference`
+},
+{
+arg: 'awbCode',
+type: 'string',
+description: `awb code`
+},
+{
+arg: 'agencyModeId',
+type: 'number',
+description: `agency mode id`
+},
+{
+arg: 'evaNotes',
+type: 'string',
+description: `observation`
+},
+{
+arg: 'warehouseInFk',
+type: 'number',
+description: `warehouse in id`
+},
+{
+arg: 'warehouseOutFk',
+type: 'number',
+description: `warehouse out id`
+},
+{
+arg: 'entryTypeCode',
+type: 'string',
+description: 'entry type code'
+},
],
returns: {
type: ['object'],
@ -156,19 +217,12 @@
{'s.name': {like: `%${value}%`}},
{'s.nickname': {like: `%${value}%`}}
]};
+case 'invoiceNumber':
+case 'reference':
case 'ref':
+case 'evaNotes':
param = `e.${param}`;
return {[param]: {like: `%${value}%`}};
-case 'created':
-return {'e.created': {gte: value}};
-case 'from':
-return {'t.landed': {gte: value}};
-case 'fromShipped':
-return {'t.shipped': {gte: value}};
-case 'to':
-return {'t.landed': {lte: value}};
-case 'toShipped':
-return {'t.shipped': {lte: value}};
case 'id':
case 'isBooked':
case 'isConfirmed':
@ -178,8 +232,20 @@
case 'currencyFk':
case 'supplierFk':
case 'invoiceInFk':
-param = `e.${param}`;
-return {[param]: value};
+case 'isExcludedFromAvailable':
+return {[`e.${param}`]: value};
+case 'isReceived':
+case 'landed':
+case 'isRaid':
+case 'warehouseInFk':
+case 'warehouseOutFk':
+return {[`t.${param}`]: value};
+case 'awbCode':
+return {'a.code': {like: `%${value}%`}};
+case 'agencyModeId':
+return {[`am.id`]: value};
+case 'entryTypeCode':
+return {[`et.code`]: value};
case 'initialTemperature':
return {'e.initialTemperature': {lte: value}};
case 'finalTemperature':
@ -197,15 +263,14 @@
const stmts = [];
let stmt;
stmt = new ParameterizedSQL(
-`SELECT
-e.id,
+`SELECT e.id,
e.supplierFk,
e.dated,
e.reference,
e.invoiceNumber,
e.isBooked,
e.isExcludedFromAvailable,
-e.evaNotes observation,
+e.evaNotes,
e.isConfirmed,
e.isOrdered,
t.isRaid,
@ -227,15 +292,27 @@
cu.code currencyCode,
t.shipped,
t.landed,
-t.ref AS travelRef,
+t.ref travelRef,
t.warehouseInFk,
-w.name warehouseInName
+w.name warehouseInName,
+t.warehouseOutFk,
+w2.name warehouseOutName,
+a.code awbCode,
+am.id agencyModeId,
+am.name agencyModeName,
+et.code entryTypeCode,
+et.description entryTypeDescription,
+t.isReceived
FROM vn.entry e
JOIN vn.supplier s ON s.id = e.supplierFk
-JOIN vn.travel t ON t.id = e.travelFk
-JOIN vn.warehouse w ON w.id = t.warehouseInFk
-JOIN vn.company co ON co.id = e.companyFk
-JOIN vn.currency cu ON cu.id = e.currencyFk`
+LEFT JOIN vn.travel t ON t.id = e.travelFk
+LEFT JOIN vn.warehouse w ON w.id = t.warehouseInFk
+LEFT JOIN vn.warehouse w2 ON w2.id = t.warehouseOutFk
+LEFT JOIN vn.company co ON co.id = e.companyFk
+LEFT JOIN vn.currency cu ON cu.id = e.currencyFk
+LEFT JOIN vn.awb a ON a.id = t.awbFk
+LEFT JOIN vn.agencyMode am ON am.id = t.agencyModeFk
+LEFT JOIN vn.entryType et ON et.code = e.typeFk`
);
stmt.merge(conn.makeWhere(filter.where));

View File

@ -0,0 +1,49 @@
module.exports = Self => {
Self.remoteMethodCtx('recalcEntryPrices', {
description: 'Recalculates the buy prices of an entry',
accessType: 'WRITE',
accepts: [{
arg: 'entryFk',
type: 'number',
required: true,
description: 'The entry id',
http: {source: 'path'}
}],
returns: {
type: 'object',
root: true
},
http: {
path: `/:entryFk/recalcEntryPrices`,
verb: 'POST'
}
});
Self.recalcEntryPrices = async(ctx, entryFk, options) => {
const userId = ctx.req.accessToken.userId;
const myOptions = {userId};
let tx;
if (typeof options == 'object')
Object.assign(myOptions, options);
if (!myOptions.transaction) {
tx = await Self.beginTransaction({});
myOptions.transaction = tx;
}
const entry = await Self.findById(entryFk, myOptions);
const entryConfig = await Self.app.models.EntryConfig.findOne({}, myOptions);
if (entry.supplierFk === entryConfig.inventorySupplierFk) return;
try {
const result = await Self.rawSql('CALL vn.buy_recalcPricesByEntry(?)', [entryFk], myOptions);
if (tx) await tx.commit();
return result[0];
} catch (e) {
if (tx) await tx.rollback();
throw e;
}
};
};
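
recalcEntryPrices is a thin wrapper over vn.buy_recalcPricesByEntry and skips entries that belong to the configured inventory supplier (entryConfig.inventorySupplierFk). A hedged usage sketch with a placeholder id:

const {models} = require('vn-loopback/server/server');

async function recalcPrices(entryFk) {
    const ctx = {req: {accessToken: {userId: 9}}};
    const tx = await models.Entry.beginTransaction({});
    try {
        const result = await models.Entry.recalcEntryPrices(ctx, entryFk, {transaction: tx});
        return result; // undefined when the entry belongs to the inventory supplier
    } finally {
        await tx.rollback();
    }
}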

View File

@ -31,5 +31,8 @@
},
"InventoryConfig": {
"dataSource": "vn"
+},
+"EntryConfig": {
+"dataSource": "vn"
}
}

View File

@ -0,0 +1,30 @@
{
"name": "EntryConfig",
"base": "VnModel",
"mixins": {
"Loggable": true
},
"options": {
"mysql": {
"table": "entryConfig"
}
},
"properties": {
"defaultEntry": {
"type": "number",
"id": true
},
"mailToNotify": {
"type": "string"
},
"inventorySupplierFk": {
"type": "number"
},
"maxLockTime": {
"type": "number"
},
"defaultSupplierFk": {
"type": "number"
}
}
}

View File

@ -15,8 +15,13 @@ module.exports = Self => {
require('../methods/entry/transfer')(Self);
require('../methods/entry/labelSupplier')(Self);
require('../methods/entry/buyLabelSupplier')(Self);
+require('../methods/entry-buys/getBuyList')(Self);
+require('../methods/entry-buys/getBuyUltimate')(Self);
+require('../methods/entry/cloneEntry')(Self);
+require('../methods/entry/deleteEntry')(Self);
+require('../methods/entry/recalcEntryPrices')(Self);
-Self.observe('before save', async function(ctx, options) {
+Self.observe('before save', async(ctx, options) => {
if (ctx.isNewInstance) return;
const changes = ctx.data || ctx.instance;

View File

@ -56,8 +56,7 @@
"required": true "required": true
}, },
"travelFk": { "travelFk": {
"type": "number", "type": "number"
"required": true
}, },
"companyFk": { "companyFk": {
"type": "number", "type": "number",
@ -74,6 +73,12 @@
}, },
"finalTemperature": { "finalTemperature": {
"type": "number" "type": "number"
},
"lockerUserFk":{
"type": "number"
},
"locked":{
"type": "date"
} }
}, },
"relations": { "relations": {
@ -107,6 +112,16 @@
"type": "belongsTo", "type": "belongsTo",
"model": "EntryType", "model": "EntryType",
"foreignKey": "typeFk" "foreignKey": "typeFk"
} },
"invoiceIn": {
"type": "belongsTo",
"model": "InvoiceIn",
"foreignKey": "invoiceInFk"
},
"user": {
"type": "belongsTo",
"model": "VnUser",
"foreignKey": "lockerUserFk"
}
} }
} }

View File

@ -0,0 +1,43 @@
module.exports = Self => {
Self.remoteMethodCtx('getSimilar', {
description: 'Returns a list of items similar to the requested item',
accessType: 'READ',
accepts: [
{
arg: 'filter',
type: 'Object',
required: true,
description: 'Filter defining where and paginated data',
http: {source: 'query'}
}
],
returns: {
type: ['Object'],
root: true
},
http: {
path: `/getSimilar`,
verb: 'GET'
}
});
Self.getSimilar = async(ctx, filter, options) => {
const myOptions = {userId: ctx.req.accessToken.userId};
if (typeof options == 'object')
Object.assign(myOptions, options);
const {where} = filter;
const query = [
filter.itemFk,
where.warehouseFk,
where.date,
where.showType,
where.scopeDays
];
const [results] = await Self.rawSql('CALL vn.item_getSimilar(?, ?, ?, ?, ?)', query, myOptions);
return results;
};
};

View File

@ -0,0 +1,38 @@
const ParameterizedSQL = require('loopback-connector').ParameterizedSQL;
module.exports = Self => {
Self.remoteMethodCtx('search', {
description: 'Returns an array of search results for a specified item',
accepts: [{
arg: 'filter',
type: 'object',
description: 'Filter to define conditions and paginate the data.',
required: true
}],
returns: {
type: ['object'],
root: true
},
http: {
path: `/search`,
verb: 'GET'
}
});
Self.search = async(ctx, filter) => {
const conn = Self.dataSource.connector;
const stmt = new ParameterizedSQL(`
SELECT *
FROM(
SELECT i.id, i.name, i.size, p.name producerName
FROM item i
LEFT JOIN producer p ON p.id = i.producerFk
) sub
`);
stmt.merge(conn.makeSuffix(filter));
return conn.executeStmt(stmt);
};
};
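
search simply appends the LoopBack filter as a SQL suffix over the item/producer subquery, so where, order and limit behave like any other list endpoint. A hedged usage sketch:

const {models} = require('vn-loopback/server/server');

async function searchItems(name) {
    const ctx = {req: {accessToken: {userId: 9}}};
    const filter = {
        where: {name: {like: `%${name}%`}},
        order: 'name',
        limit: 10
    };
    return models.Item.search(ctx, filter); // [{id, name, size, producerName}, ...]
}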

View File

@ -89,7 +89,7 @@ describe('item filter()', () => {
const ctx = {args: {filter: filter, workerFk: 16}, req: {accessToken: {userId: 1}}};
const result = await models.Item.filter(ctx, filter, options);
-expect(result.length).toEqual(2);
+expect(result.length).toEqual(3);
expect(result[0].id).toEqual(16);
expect(result[1].id).toEqual(71);

View File

@ -0,0 +1,49 @@
const models = require('vn-loopback/server/server').models;
describe('Item get similar', () => {
let options;
let tx;
const ctx = beforeAll.getCtx();
beforeAll.mockLoopBackContext();
beforeEach(async() => {
tx = await models.Item.beginTransaction({});
options = {transaction: tx};
});
afterEach(async() => {
if (tx)
await tx.rollback();
});
it('should return similar items', async() => {
const filter = {
itemFk: 88, sales: 43,
where: {
'scopeDays': '2',
'showType': true,
'alertLevelCode': 'FREE',
'date': '2001-01-01T11:00:00.000Z',
'warehouseFk': 1
}
};
const result = await models.Item.getSimilar(ctx, filter, options);
expect(result.length).toEqual(2);
});
it('should return an empty array if no similar items exist', async() => {
const filter = {
itemFk: 88, sales: 43,
where: {
'scopeDays': '2',
'showType': true,
'alertLevelCode': 'FREE',
'date': '2001-01-01T11:00:00.000Z',
'warehouseFk': 60
}
};
const result = await models.Item.getSimilar(ctx, filter, options);
expect(result.length).toEqual(0);
});
});

View File

@ -26,7 +26,7 @@ describe('tag filterValue()', () => {
const filter = {where: {value: 'Blue'}, limit: 5};
const result = await models.Tag.filterValue(colorTagId, filter, options);
-expect(result.length).toEqual(2);
+expect(result.length).toEqual(3);
expect(result[0].value).toEqual('Blue');
expect(result[1].value).toEqual('Blue/Silver');

View File

@ -17,6 +17,9 @@
},
"showOrder": {
"type": "number"
+},
+"hexJson": {
+"type": "string"
}
},
"acls": [

View File

@ -5,6 +5,7 @@ module.exports = Self => {
require('../methods/item/clone')(Self);
require('../methods/item/updateTaxes')(Self);
require('../methods/item/getBalance')(Self);
+require('../methods/item/getSimilar')(Self);
require('../methods/item/lastEntriesFilter')(Self);
require('../methods/item/getSummary')(Self);
require('../methods/item/getCard')(Self);
@ -17,6 +18,7 @@ module.exports = Self => {
require('../methods/item/buyerWasteEmail')(Self);
require('../methods/item/setVisibleDiscard')(Self);
require('../methods/item/get')(Self);
+require('../methods/item/search')(Self);
Self.validatesPresenceOf('originFk', {message: 'Cannot be blank'});

View File

@ -31,7 +31,7 @@ describe('route getSuggestedTickets()', () => {
const length = result.length;
const anyResult = result[Math.floor(Math.random() * Math.floor(length))];
-expect(result.length).toEqual(4);
+expect(result.length).toEqual(5);
expect(anyResult.zoneFk).toEqual(1);
expect(anyResult.agencyModeFk).toEqual(8);

View File

@ -14,7 +14,7 @@ describe('route unlink()', () => {
let tickets = await models.Route.getSuggestedTickets(routeId, options);
expect(zoneAgencyModes.length).toEqual(4);
-expect(tickets.length).toEqual(3);
+expect(tickets.length).toEqual(4);
await models.Route.unlink(agencyModeId, zoneId, options);

View File

@ -33,7 +33,7 @@
"observations": { "observations": {
"type": "string" "type": "string"
}, },
"userFk": { "editorFk": {
"type": "number" "type": "number"
}, },
"price": { "price": {

View File

@ -15,16 +15,13 @@
"roadmapFk": { "roadmapFk": {
"type": "number" "type": "number"
}, },
"addressFk": {
"type": "number"
},
"eta": { "eta": {
"type": "date" "type": "date"
}, },
"description": { "description": {
"type": "string" "type": "string"
}, },
"userFk": { "editorFk": {
"type": "number" "type": "number"
} }
}, },
@ -37,7 +34,7 @@
"address": { "address": {
"type": "belongsTo", "type": "belongsTo",
"model": "RoadmapAddress", "model": "RoadmapAddress",
"foreignKey": "addressFk" "foreignKey": "roadmapAddressFk"
} }
} }
} }

View File

@ -7,6 +7,6 @@ describe('Supplier getItemsPackaging()', () => {
expect(item.id).toEqual(1);
expect(item.name).toEqual('Ranged weapon longbow 200cm');
expect(item.quantity).toEqual(5000);
-expect(item.quantityTotal).toEqual(5100);
+expect(item.quantityTotal).toEqual(5200);
});
});

View File

@ -49,7 +49,7 @@ module.exports = Self => {
ps.monitorId,
e.created
FROM expedition e
-JOIN host h ON Convert(h.code USING utf8mb3) COLLATE utf8mb3_unicode_ci = e.hostFk
+JOIN host h ON h.code = e.hostFk
JOIN packingSite ps ON ps.hostFk = h.id
WHERE e.id = ?;`;
const [expedition] = await models.Expedition.rawSql(query, [id]);

View File

@ -44,12 +44,14 @@ module.exports = Self => {
ps.monitorId,
e.created
FROM expedition e
-JOIN host h ON Convert(h.code USING utf8mb3) COLLATE utf8mb3_unicode_ci = e.hostFk
+JOIN host h ON h.code = e.hostFk
JOIN packingSite ps ON ps.hostFk = h.id
WHERE e.id = ?;`;
-const [expedition] = await models.PackingSiteConfig.rawSql(query, [id]);
+const [expedition] = await models.PackingSiteConfig.rawSql(query, [id], myOptions);
if (!from && !expedition) return [];
let start = new Date(expedition.created);
let end = new Date(start.getTime() + (packingSiteConfig.avgBoxingTime * 1000));
@ -57,9 +59,13 @@ module.exports = Self => {
start.setHours(from, 0, 0);
end.setHours(to, 0, 0);
}
const offset = start.getTimezoneOffset();
start = new Date(start.getTime() - (offset * 60 * 1000));
end = new Date(end.getTime() - (offset * 60 * 1000));
+const minutes = start.getMinutes();
+const roundedMinutes = minutes - (minutes % 15);
+start.setMinutes(roundedMinutes, 0, 0);
const videoUrl =
`/${packingSiteConfig.shinobiToken}/videos/${packingSiteConfig.shinobiGroupKey}/${expedition.monitorId}`;
@ -73,6 +79,7 @@ module.exports = Self => {
} catch (e) {
return [];
}
return response.data.videos.map(video => video.filename);
};
};
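
The added lines round the video window start down to the previous quarter hour before querying Shinobi, which is what the updated spec below asserts (07:07 becomes 07:00). A minimal sketch of that arithmetic:

// Minimal sketch of the quarter-hour rounding added above.
function roundDownToQuarterHour(date) {
    const rounded = new Date(date.getTime());
    const minutes = rounded.getMinutes();
    rounded.setMinutes(minutes - (minutes % 15), 0, 0);
    return rounded;
}

// Example: 2000-12-01 07:07 local time -> 2000-12-01 07:00
console.log(roundDownToQuarterHour(new Date(2000, 11, 1, 7, 7)));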

View File

@ -2,35 +2,28 @@ const models = require('vn-loopback/server/server').models;
const axios = require('axios');
describe('boxing getVideoList()', () => {
-it('should return video list', async() => {
-const tx = await models.PackingSiteConfig.beginTransaction({});
-try {
-const options = {transaction: tx};
-const id = 1;
-const from = 1;
-const to = 2;
-const response = {
-data: {
-videos: [{
-id: 1,
-filename: 'video1.mp4'
-}]
-}
-};
-spyOn(axios, 'get').and.returnValue(new Promise(resolve => resolve(response)));
-const result = await models.Boxing.getVideoList(id, from, to, options);
-expect(result[0]).toEqual(response.data.videos[0].filename);
-await tx.rollback();
-} catch (e) {
-await tx.rollback();
-throw e;
-}
+let tx;
+let options;
+beforeEach(async() => {
+tx = await models.PackingSiteConfig.beginTransaction({});
+options = {transaction: tx};
+});
+afterEach(async() => {
+await tx.rollback();
+});
+it('should make the correct API call', async() => {
+const expedition = await models.Expedition.findById(15, null, options);
+await expedition.updateAttribute('created', '2000-12-01 07:07:00', options);
+const axiosSpy = spyOn(axios, 'get').and.callThrough();
+await models.Boxing.getVideoList(expedition.id, undefined, undefined, options);
+const expectedStartTime = '2000-12-01T07:00:00';
+const calledUrl = axiosSpy.calls.mostRecent().args[0];
+expect(calledUrl).toContain(`start=${expectedStartTime}`);
});
});

View File

@ -0,0 +1,99 @@
module.exports = Self => {
Self.remoteMethodCtx('replaceItem', {
description: 'Replaces the item of a sale with a substitution',
accessType: 'WRITE',
accepts: [
{
arg: 'saleFk',
type: 'number',
required: true,
},
{
arg: 'substitutionFk',
type: 'number',
required: true
},
{
arg: 'quantity',
type: 'number',
required: true
}
],
returns: {
type: 'object',
root: true
},
http: {
path: `/replaceItem`,
verb: 'POST'
}
});
Self.replaceItem = async(ctx, saleFk, substitutionFk, quantity, options) => {
const myOptions = {userId: ctx.req.accessToken.userId};
let tx;
const $t = ctx.req.__;
const models = Self.app.models;
if (typeof options == 'object')
Object.assign(myOptions, options);
if (!myOptions.transaction) {
tx = await Self.beginTransaction({});
myOptions.transaction = tx;
}
try {
const replaceItemQuery = {
sql: 'CALL sale_replaceItem(?,?,?)',
query: [saleFk, substitutionFk, quantity]
};
const resultReplaceItem = await Self.rawSql(replaceItemQuery.sql, replaceItemQuery.query, myOptions);
const sale = await models.Sale.findById(saleFk, {
fields: ['id', 'ticketFk', 'itemFk', 'quantity', 'price'],
include: [
{
relation: 'ticket',
scope: {
fields: ['id']
},
}, {
relation: 'item',
scope: {
fields: ['id', 'name', 'longName']
}
}
]
}, myOptions);
const salesPersonQuery = {
sql: 'SELECT vn.client_getSalesPersonByTicket(?)',
query: [sale.ticketFk]
};
const salesPerson = await Self.rawSql(salesPersonQuery.sql, salesPersonQuery.query, myOptions);
const url = await models.Url.getUrl();
const substitution = await models.Item.findById(substitutionFk, {
fields: ['id', 'name', 'longName']
}, myOptions);
const message = $t('negativeReplaced', {
oldItemId: sale.itemFk,
oldItem: sale.item().longName,
oldItemUrl: `${url}item/${sale.itemFk}/summary`,
newItemId: substitution.id,
newItem: substitution.longName,
newItemUrl: `${url}item/${substitution.id}/summary`,
ticketId: sale.ticketFk,
ticketUrl: `${url}ticket/${sale.ticketFk}/sale`
});
await models.Chat.sendCheckingPresence(ctx, salesPerson.id, message);
if (tx) await tx.commit();
return resultReplaceItem;
} catch (e) {
if (tx) await tx.rollback();
throw e;
}
};
};

View File

@ -0,0 +1,61 @@
const {models} = require('vn-loopback/server/server');
describe('Sale - replaceItem function', () => {
let options;
let tx;
const ctx = beforeAll.getCtx();
beforeAll.mockLoopBackContext();
beforeEach(async() => {
tx = await models.Sale.beginTransaction({});
options = {transaction: tx};
});
afterEach(async() => {
if (tx)
await tx.rollback();
});
it('should replace full item in sale and send notification', async() => {
const saleFk = 43;
const substitutionFk = 3;
const quantity = 15;
const ticketFk = 1000000;
const salesBefore = await models.Sale.find({where: {ticketFk}}, options);
const salesLength = salesBefore.length;
expect(1).toEqual(salesBefore.length);
await models.Sale.replaceItem(ctx, saleFk, substitutionFk, quantity, options);
const salesAfter = await models.Sale.find({where: {ticketFk}}, options);
expect(salesLength).toBeLessThan(salesAfter.length);
expect(salesAfter[0].id).toEqual(saleFk);
expect(salesAfter[salesLength].itemFk).toEqual(substitutionFk);
expect(salesAfter[salesLength].quantity).toEqual(quantity);
expect(salesAfter[0].quantity).toEqual(0);
expect(salesAfter[salesLength].concept).toMatch(/^\+/);
});
it('should replace half item in sale and send notification', async() => {
const saleFk = 43;
const substitutionFk = 3;
const quantity = 10;
const ticketFk = 1000000;
const salesBefore = await models.Sale.find({where: {ticketFk}}, options);
const salesLength = salesBefore.length;
expect(1).toEqual(salesBefore.length);
await models.Sale.replaceItem(ctx, saleFk, substitutionFk, quantity, options);
const salesAfter = await models.Sale.find({where: {ticketFk}}, options);
expect(salesLength).toBeLessThan(salesAfter.length);
expect(salesAfter[0].id).toEqual(saleFk);
expect(salesAfter[salesLength].itemFk).toEqual(substitutionFk);
expect(salesAfter[salesLength].quantity).toEqual(quantity);
expect(salesAfter[0].quantity).toEqual(5);
expect(salesAfter[salesLength].concept).toMatch(/^\+/);
});
});

View File

@ -1,4 +1,5 @@
-const closure = require('./closure');
+const smtp = require('vn-print/core/smtp');
+const config = require('vn-print/core/config');

module.exports = Self => {
Self.remoteMethodCtx('closeAll', {
@ -25,122 +26,62 @@ module.exports = Self => {
Self.closeAll = async(ctx, options) => {
const userId = ctx.req.accessToken.userId;
const myOptions = {userId};
+let tx;
if (typeof options == 'object')
Object.assign(myOptions, options);
-let tx;
-// IMPORTANT: Due to its high cost in production, wrapping this process in a transaction may cause timeouts.
+if (!myOptions.transaction) {
+tx = await Self.beginTransaction({});
+myOptions.transaction = tx;
+}
const toDate = Date.vnNew();
toDate.setHours(0, 0, 0, 0);
toDate.setDate(toDate.getDate() - 1);
+const [{dateFrom, dateTo}] = await Self.rawSql(`
+SELECT ? - INTERVAL closureDaysAgo DAY dateFrom,
+util.dayEnd(?) dateTo
+FROM ticketConfig
+LIMIT 1`, [toDate, toDate], myOptions);
-const tickets = await Self.rawSql(`
-SELECT t.id,
-t.clientFk,
-t.companyFk,
-c.id clientFk,
-c.name clientName,
-c.email recipient,
-c.salesPersonFk,
-c.isToBeMailed,
-c.hasToInvoice,
-c.hasDailyInvoice,
-eu.email salesPersonEmail,
-t.addressFk
-FROM ticket t
-JOIN agencyMode am ON am.id = t.agencyModeFk
-JOIN warehouse wh ON wh.id = t.warehouseFk AND wh.hasComission
-JOIN ticketState ts ON ts.ticketFk = t.id
-JOIN alertLevel al ON al.id = ts.alertLevel
-JOIN client c ON c.id = t.clientFk
-JOIN province p ON p.id = c.provinceFk
-JOIN country co ON co.id = p.countryFk
-LEFT JOIN account.emailUser eu ON eu.userFk = c.salesPersonFk
-JOIN ticketConfig tc ON TRUE
-WHERE (al.code = 'PACKED' OR (am.code = 'refund' AND al.code <> 'delivered'))
-AND t.shipped BETWEEN ? - INTERVAL tc.closureDaysAgo DAY AND util.dayEnd(?)
-AND t.refFk IS NULL
-GROUP BY t.id
-`, [toDate, toDate], myOptions);
-const ticketIds = tickets.map(ticket => ticket.id);
-await Self.rawSql(`
-INSERT INTO util.debug (variable, value)
-VALUES ('nightInvoicing', ?)
-`, [ticketIds.join(',')], myOptions);
-await Self.rawSql(`
-WITH ticketNotInvoiceable AS(
-SELECT JSON_OBJECT(
-'tickets',
-JSON_ARRAYAGG(
-JSON_OBJECT(
-'ticketId', ticketFk,
-'reason', reason,
-'clientId', clientFk
-)
-)
-)errors
-FROM (
-SELECT ticketFk,
-CONCAT_WS(', ',
-IF(hasErrorToInvoice, 'Facturar', NULL),
-IF(hasErrorTaxDataChecked, 'Datos comprobados', NULL),
-IF(hasErrorDeleted, 'Eliminado', NULL),
-IF(hasErrorItemTaxCountry, 'Impuesto no informado', NULL),
-IF(hasErrorAddress, 'Sin dirección', NULL),
-IF(hasErrorInfoTaxAreaWorld, 'Datos exportaciones', NULL)) reason,
-clientFk
-FROM (
-SELECT t.id ticketFk,
-SUM(NOT c.hasToInvoice) hasErrorToInvoice,
-SUM(NOT c.isTaxDataChecked) hasErrorTaxDataChecked,
-SUM(t.isDeleted) hasErrorDeleted,
-SUM(itc.id IS NULL) hasErrorItemTaxCountry,
-SUM(a.id IS NULL) hasErrorAddress,
-SUM(ios.code IS NOT NULL
-AND(ad.customsAgentFk IS NULL
-OR ad.incotermsFk IS NULL)) hasErrorInfoTaxAreaWorld,
-t.clientFk clientFk
-FROM ticket t
-LEFT JOIN address ad ON ad.id = t.addressFk
-JOIN sale s ON s.ticketFk = t.id
-JOIN item i ON i.id = s.itemFk
-JOIN supplier su ON su.id = t.companyFk
-JOIN agencyMode am ON am.id = t.agencyModeFk
-JOIN warehouse wh ON wh.id = t.warehouseFk AND wh.hasComission
-JOIN ticketState ts ON ts.ticketFk = t.id
-JOIN alertLevel al ON al.id = ts.alertLevel
-JOIN client c ON c.id = t.clientFk
-JOIN province p ON p.id = c.provinceFk
-JOIN ticketConfig tc ON TRUE
-LEFT JOIN autonomy a ON a.id = p.autonomyFk
-JOIN country co ON co.id = p.countryFk
-LEFT JOIN account.emailUser eu ON eu.userFk = c.salesPersonFk
-LEFT JOIN itemTaxCountry itc ON itc.itemFk = i.id
-AND itc.countryFk = su.countryFk
-LEFT JOIN vn.invoiceOutSerial ios ON ios.taxAreaFk = 'WORLD'
-AND ios.code = invoiceSerial(t.clientFk, t.companyFk, 'multiple')
-WHERE (al.code = 'PACKED' OR (am.code = 'refund' AND al.code <> 'delivered'))
-AND t.shipped BETWEEN ? - INTERVAL tc.closureDaysAgo DAY AND util.dayEnd(?)
-AND t.refFk IS NULL
-AND c.hasDailyInvoice
-GROUP BY ticketFk
-HAVING hasErrorToInvoice
-OR hasErrorTaxDataChecked
-OR hasErrorDeleted
-OR hasErrorItemTaxCountry
-OR hasErrorAddress
-OR hasErrorInfoTaxAreaWorld
-)sub
-)sub2
-) SELECT IF(errors = '{"tickets": null}',
-'No errors',
-util.notification_send('invoice-ticket-closure', errors, NULL))
-FROM ticketNotInvoiceable`, [toDate, toDate], myOptions);
-await closure(ctx, Self, tickets, myOptions);
+await Self.rawSql(`
+DROP TEMPORARY TABLE IF EXISTS tmp.ticket_close;
+CREATE TEMPORARY TABLE tmp.ticket_close
+ENGINE = MEMORY
+WITH wTickets AS(
+SELECT t.id ticketFk
+FROM ticket t
+JOIN warehouse wh ON wh.id = t.warehouseFk AND wh.hasComission
+WHERE t.shipped BETWEEN ? AND ?
+AND t.refFk IS NULL
+), wTicketsTracking AS(
+SELECT wt.ticketFk, MAX(tt.id) maxTracking
+FROM wTickets wt
+JOIN ticketTracking tt ON tt.ticketFk = wt.ticketFk
+GROUP BY tt.ticketFk
+), wTicketsLastState AS(
+SELECT wt.ticketFk, tt.stateFk
+FROM wTicketsTracking wt
+JOIN ticketTracking tt ON tt.id = wt.maxTracking
+) SELECT tls.ticketFk,
+t.clientFk,
+c.name clientName,
+c.email recipient,
+eu.email salesPersonEmail,
+t.addressFk,
+c.hasDailyInvoice,
+c.hasToInvoiceByAddress,
+t.totalWithVat,
+t.companyFk
+FROM wTicketsLastState tls
+JOIN ticket t ON t.id = tls.ticketFk
+JOIN state s ON s.id =tls.stateFk
+JOIN alertLevel al ON al.id = s.alertLevel
+JOIN agencyMode am ON am.id = t.agencyModeFk
+JOIN client c ON c.id = t.clientFk
+LEFT JOIN account.emailUser eu ON eu.userFk = c.salesPersonFk
+WHERE (al.code = 'PACKED' OR (am.code = 'refund' AND al.code <> 'delivered'));
+CALL ticket_close();
+`, [dateFrom, dateTo], myOptions);

await Self.rawSql(`
UPDATE ticket t
@ -151,17 +92,100 @@ module.exports = Self => {
JOIN ticketConfig tc ON TRUE
LEFT JOIN ticketObservation tob ON tob.ticketFk = t.id
SET t.routeFk = NULL
-WHERE t.shipped BETWEEN ? - INTERVAL tc.closureDaysAgo DAY AND util.dayEnd(?)
+WHERE t.shipped BETWEEN ? AND ?
AND al.code NOT IN ('DELIVERED', 'PACKED')
AND NOT t.packages
AND tob.id IS NULL
-AND t.routeFk`, [toDate, toDate], myOptions);
+AND t.routeFk`, [dateFrom, dateTo], myOptions);
+const [clients] = await Self.rawSql(`
+SELECT clientFk clientId,
+clientName,
+recipient,
+salesPersonEmail,
+addressFk addressId,
+companyFk,
+SUM(totalWithVat) total,
+'quick' serialType
+FROM tmp.ticket_close
+WHERE hasDailyInvoice
+GROUP BY IF (hasToInvoiceByAddress, addressFk, clientFk), companyFk
+HAVING total > 0;
+DROP TEMPORARY TABLE tmp.ticket_close;
+`, [], myOptions);
if (tx)
await tx.commit();
+const failedClients = [];
+const nestedTransaction = options?.transaction ? myOptions : {};
+for (const client of clients) {
+ctx.args = {
+...client,
+invoiceDate: Date.vnNew(),
+maxShipped: toDate
+};
+try {
+const id = await Self.app.models.InvoiceOut.invoiceClient(ctx, nestedTransaction);
+if (id)
+await Self.app.models.InvoiceOut.makePdfAndNotify(ctx, id, null, nestedTransaction);
+} catch (error) {
+await Self.rawSql(`
+INSERT INTO util.debug (variable, value)
+VALUES ('invoicingTicketError', ?)
+`, [client.clientId + ' - ' + error]);
+if (error.responseCode == 450) {
+await invalidEmail(client);
+continue;
+}
+failedClients.push({
+id: client.clientId,
+address: client.addressId,
+error
+});
+}
+}
+if (failedClients.length > 0) {
+let body = 'The following tickets have failed:<br/><br/>';
+for (const client of failedClients) {
+body += `Client: <strong>${client.id}</strong>
+Address: <strong>${client.address}</strong>
+<br/> <strong>${client.error}</strong><br/><br/>`;
+}
+smtp.send({
+to: config.app.reportEmail,
+subject: '[API] Nightly ticket closure report',
+html: body,
+}).catch(err => console.error(err));
+}
return {
message: 'Success'
};
};
+async function invalidEmail(client) {
+await Self.rawSql(
+`UPDATE client SET email = NULL WHERE id = ?`,
+[client.clientId],
+myOptions
+);
+const body = `No se ha podido facturar al cliente <strong>${client.clientId} - ${client.clientName}</strong>
+porque la dirección de email <strong>"${client.recipient}"</strong> no es correcta
+o no está disponible.<br/><br/>
+Para evitar que se repita este error, se ha eliminado la dirección de email de la ficha del cliente.
+Actualiza la dirección de email con una correcta.`;
+smtp.send({
+to: client.salesPersonEmail,
+subject: 'No se ha podido enviar el albarán',
+html: body,
+}).catch(err => console.error(err));
+}
};

View File

@ -0,0 +1,111 @@
module.exports = Self => {
Self.remoteMethod('itemLack', {
description: 'Get tickets with negative status',
accessType: 'READ',
accepts: [
{
arg: 'ctx',
type: 'object',
http: {source: 'context'}
},
{
arg: 'filter',
type: 'object',
description: 'Filter defining where, order, offset, and limit - must be a JSON-encoded string',
http: {source: 'query'}
},
{
arg: 'id',
type: 'number',
description: 'The item id',
},
{
arg: 'longname',
type: 'string',
description: 'Article name',
},
{
arg: 'supplier',
type: 'string',
description: 'Supplier id',
},
{
arg: 'colour',
type: 'string',
description: 'Item colour',
},
{
arg: 'size',
type: 'string',
description: 'Item size',
},
{
arg: 'origen',
type: 'string',
description: 'Origin id',
},
{
arg: 'warehouseFk',
type: 'number',
description: 'The warehouse id',
},
{
arg: 'lack',
type: 'number',
description: 'The lack quantity',
},
{
arg: 'days',
type: 'number',
description: 'The range days',
}
],
returns: [
{
arg: 'body',
type: ['object'],
root: true
}
],
http: {
path: `/itemLack`,
verb: 'GET'
}
});
Self.itemLack = async(ctx, filter, options) => {
const myOptions = {};
if (typeof options == 'object')
Object.assign(myOptions, options);
const filterKeyOrder = [
'id', 'force', 'days', 'longname', 'supplier',
'colour', 'size', 'originFk',
'lack', 'warehouseFk'
];
delete ctx?.args?.ctx;
delete ctx?.args?.filter;
Object.assign(filter, ctx.args ?? {});
let procedureParams = [];
procedureParams.push(...filterKeyOrder.map(clave => filter[clave] ?? null));
// Default values
const forceIndex = filterKeyOrder.indexOf('force');
if (!procedureParams[forceIndex]) procedureParams[forceIndex] = true;
const daysIndex = filterKeyOrder.indexOf('days');
if (!procedureParams[daysIndex]) procedureParams[daysIndex] = 2;
const procedureArgs = Array(procedureParams.length).fill('?').join(', ');
let query = `CALL vn.item_getLack(${procedureArgs})`;
const result = await Self.rawSql(query, procedureParams, myOptions);
const itemsIndex = 0;
return result[itemsIndex];
};
};
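
itemLack flattens the query arguments into positional parameters for vn.item_getLack following filterKeyOrder, defaulting force to true and days to 2. A hedged sketch of that mapping in isolation:

// Sketch of the positional-parameter mapping used by itemLack above.
const filterKeyOrder = [
    'id', 'force', 'days', 'longname', 'supplier',
    'colour', 'size', 'originFk', 'lack', 'warehouseFk'
];

function buildItemLackParams(filter) {
    const params = filterKeyOrder.map(key => filter[key] ?? null);
    if (!params[filterKeyOrder.indexOf('force')]) params[filterKeyOrder.indexOf('force')] = true;
    if (!params[filterKeyOrder.indexOf('days')]) params[filterKeyOrder.indexOf('days')] = 2;
    return params;
}

// buildItemLackParams({id: 5}) -> [5, true, 2, null, null, null, null, null, null, null]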

View File

@ -0,0 +1,167 @@
const {ParameterizedSQL} = require('loopback-connector');
module.exports = Self => {
Self.remoteMethod('itemLackDetail', {
description: 'Retrieve detail from ticket as negative',
accessType: 'READ',
accepts: [
{
arg: 'itemFk',
type: 'number',
description: 'The item as negative status',
},
{
arg: 'filter',
type: 'object',
description: 'Filter defining where, order, offset, and limit - must be a JSON-encoded string',
http: {source: 'query'}
}
],
returns: [
{
arg: 'body',
type: ['object'],
root: true,
},
],
http: {
path: `/itemLack/:itemFk`,
verb: 'GET',
},
});
Self.itemLackDetail = async(itemFk, filter, options) => {
const conn = Self.dataSource.connector;
const myOptions = {};
if (typeof options == 'object') Object.assign(myOptions, options);
const vDated = (Date.vnNew());
vDated.setHours(0, 0, 0, 0);
const scopeDays = filter.where.scopeDays ?? 0;
let alertLevels = filter.where.alertLevelCode;
if (!alertLevels)
alertLevels = (await Self.app.models.AlertLevel.find({fields: ['code']})).map(({code}) => code);
const stmt = new ParameterizedSQL(`
SELECT s.id,
st.code,
t.id,
t.nickname,
c.id customerId,
t.shipped,
s.quantity,
ag.name,
ag.id agencyFk,
tls.alertLevel alertLevel,
st.name stateName,
s.id saleFk,
s.itemFk,
s.price price,
al.code alertLevelCode,
z.name zoneName,
z.id zoneFk,
z.hour theoreticalhour,
cn.isRookie,
sc.saleClonedFk turno,
tr.saleFk peticionCompra,
DATE_FORMAT(IF(HOUR(t.shipped), t.shipped, IF(zc.hour, zc.hour, z.hour)),'%H:%i') minTimed,
FALSE isBasket,
substitution.hasObservation,
(d.code = 'spainTeamVip') hasToIgnore
FROM sale s
LEFT JOIN saleGroupDetail sgd ON sgd.saleFk = s.id
JOIN ticket t ON t.id = s.ticketFk
LEFT JOIN zone z ON z.id = t.zoneFk
LEFT JOIN zoneClosure zc ON zc.zoneFk = t.zoneFk
AND t.shipped BETWEEN zc.dated AND util.dayEnd(t.shipped)
JOIN client c ON c.id=t.clientFk
LEFT JOIN bs.clientNewBorn cn ON cn.clientFk=c.id
JOIN agencyMode ag ON ag.id=t.agencyModeFk
JOIN ticketState tls ON tls.ticketFk=t.id
LEFT JOIN state st ON st.id=tls.state
LEFT JOIN alertLevel al ON al.id = st.alertLevel
LEFT JOIN saleCloned sc ON sc.saleClonedFk = s.id
LEFT JOIN ticketRequest tr ON tr.saleFk = s.id
LEFT JOIN workerDepartment wd ON wd.workerFk = c.salesPersonFk
LEFT JOIN department d ON d.id = wd.departmentFk
LEFT JOIN (
SELECT co.clientFk, COUNT(*) hasObservation
FROM clientObservation co
JOIN observationType ot ON ot.id = co.observationTypeFk
WHERE ot.code = 'substitution'
GROUP BY co.clientFk
) substitution ON substitution.clientFk = c.id
WHERE t.warehouseFk = ?
AND s.itemFk = ?
AND s.quantity <> 0
AND t.shipped BETWEEN ? AND (? + INTERVAL ? DAY)
AND sgd.saleFk IS NULL
AND (al.code IN (?) OR al.id IS NULL)
UNION ALL
SELECT r.id,
NULL,
r.orderFk,
c.name customerName,
c.id customerId,
r.shipment,
r.amount,
ag.name,
ag.id,
NULL,
NULL,
NULL,
r.itemFk,
NULL,
NULL,
NULL,
NULL,
NULL,
cn.isRookie,
NULL,
NULL,
NULL,
TRUE,
substitution.hasObservation,
d.code = 'spainTeamVip'
FROM hedera.orderRow r
JOIN hedera.order o ON o.id = r.orderFk
JOIN client c ON c.id = o.customer_id
JOIN agencyMode ag ON ag.id=o.agency_id
LEFT JOIN bs.clientNewBorn cn ON cn.clientFk=c.id
LEFT JOIN workerDepartment wd ON wd.workerFk = c.salesPersonFk
LEFT JOIN department d ON d.id = wd.departmentFk
LEFT JOIN (
SELECT co.clientFk, COUNT(*) hasObservation
FROM clientObservation co
JOIN observationType ot ON ot.id = co.observationTypeFk
WHERE ot.code = 'substitution'
GROUP BY co.clientFk
) substitution ON substitution.clientFk = c.id
WHERE r.shipment BETWEEN ? AND ? + INTERVAL ? DAY
AND r.created >= ?
AND r.warehouseFk = ?
AND NOT o.confirmed
AND r.itemFk = ?
AND r.amount
ORDER BY hasToIgnore, isBasket
`,
[
filter.where.warehouseFk,
itemFk,
vDated, vDated,
scopeDays,
alertLevels,
scopeDays,
vDated, vDated, vDated,
filter.where.warehouseFk,
itemFk
]);
const sql = ParameterizedSQL.join([stmt], ';');
const result = await conn.executeStmt(sql, myOptions);
return result;
};
};

View File

@ -0,0 +1,80 @@
const {models} = require('vn-loopback/server/server');
describe('Item Lack', () => {
let options;
let tx;
const ctx = beforeAll.getCtx();
beforeAll.mockLoopBackContext();
beforeEach(async() => {
tx = await models.Ticket.beginTransaction({});
options = {transaction: tx};
});
afterEach(async() => {
if (tx)
await tx.rollback();
});
it('should return data with NO filters', async() => {
const filter = {};
const result = await models.Ticket.itemLack(ctx, filter, options);
expect(result.length).toEqual(2);
});
it('should return data with filter.id', async() => {
const filter = {
id: 5
};
const result = await models.Ticket.itemLack(ctx, filter, options);
expect(result.length).toEqual(1);
});
it('should return data with filter.longname', async() => {
const filter = {
longname: 'Ranged weapon pistol 9mm'
};
const result = await models.Ticket.itemLack(ctx, filter, options);
expect(result.length).toEqual(1);
});
it('should return data with filter.color', async() => {
const filter = {
colour: 'WHT'
};
const result = await models.Ticket.itemLack(ctx, filter, options);
expect(result.length).toEqual(1);
});
it('should return data with filter.origen', async() => {
const filter = {
originFk: 1
};
const result = await models.Ticket.itemLack(ctx, filter, options);
expect(result.length).toEqual(2);
});
it('should return data with filter.size', async() => {
const filter = {
size: '15'
};
const result = await models.Ticket.itemLack(ctx, filter, options);
expect(result.length).toEqual(1);
});
it('should return data with filter.lack', async() => {
const filter = {
lack: '-15'
};
const result = await models.Ticket.itemLack(ctx, filter, options);
expect(result.length).toEqual(1);
});
});

View File

@ -0,0 +1,55 @@
const models = require('vn-loopback/server/server').models;
describe('Item Lack Detail', () => {
it('should return no results if id is null', async() => {
const tx = await models.Ticket.beginTransaction({});
try {
const options = {transaction: tx};
const itemFk = null;
const filter = {where: {warehouseFk: 60}};
const result = await models.Ticket.itemLackDetail(itemFk, filter, options);
expect(result.length).toEqual(0);
await tx.rollback();
} catch (e) {
await tx.rollback();
throw e;
}
});
it('should return data if id exists', async() => {
const tx = await models.Ticket.beginTransaction({});
try {
const options = {transaction: tx};
const itemFk = 1167;
const filter = {where: {warehouseFk: 60}};
const result = await models.Ticket.itemLackDetail(itemFk, filter, options);
expect(result.length).toEqual(0);
await tx.rollback();
} catch (e) {
await tx.rollback();
throw e;
}
});
it('should return no results if the item does not exist', async() => {
const tx = await models.Ticket.beginTransaction({});
try {
const options = {transaction: tx};
const itemFk = 0;
const filter = {where: {warehouseFk: 60}};
const result = await models.Ticket.itemLackDetail(itemFk, filter, options);
expect(result.length).toEqual(0);
await tx.rollback();
} catch (e) {
await tx.rollback();
throw e;
}
});
});

View File

@ -0,0 +1,47 @@
const {models} = require('vn-loopback/server/server');
describe('Split', () => {
let options;
let tx;
const ctx = beforeAll.getCtx();
beforeAll.mockLoopBackContext();
beforeEach(async() => {
tx = await models.Ticket.beginTransaction({});
options = {transaction: tx};
});
afterEach(async() => {
if (tx)
await tx.rollback();
});
it('should split tickets with count 1', async() => {
const data =
{ticketFk: 7, sales: [1]};
const result = await models.Ticket.split(ctx, data, options);
expect(data.ticketFk).toEqual(result.ticket);
expect('noSplit').toEqual(result.status);
});
it('should split tickets with count 2 and error', async() => {
const data = {ticketFk: 11, sales: [7]};
const result = await models.Ticket.split(ctx, data, options);
expect(data.ticketFk).toEqual(result.ticket);
expect('error').toEqual(result.status);
expect('Can\'t transfer claimed sales').toEqual(result.message);
});
it('should split tickets with count 2 and success', async() => {
const data =
{ticketFk: 14, sales: [33]};
const result = await models.Ticket.split(ctx, data, options);
expect(data.ticketFk).toEqual(result.ticket);
expect('split').toEqual(result.status);
});
});

View File

@@ -0,0 +1,73 @@
module.exports = Self => {
    Self.remoteMethodCtx('split', {
        description: 'Split ticket with custom date',
        accessType: 'WRITE',
        accepts: [
            {
                arg: 'ticket',
                type: 'Object',
                required: true,
                http: {source: 'body'}
            },
            {
                arg: 'date',
                type: 'date',
                required: true,
            }
        ],
        returns: {
            type: ['Object'],
            root: true
        },
        http: {
            path: `/split`,
            verb: 'POST'
        }
    });

    Self.split = async(ctx, ticket, options) => {
        const {ticketFk} = ticket;
        const models = Self.app.models;
        const myOptions = {};
        let tx;
        let result = [];

        if (typeof options == 'object')
            Object.assign(myOptions, options);

        if (!myOptions.transaction) {
            tx = await Self.beginTransaction({});
            myOptions.transaction = tx;
        }

        try {
            const count = await models.Sale.count({
                ticketFk
            }, myOptions);

            // A ticket with a single sale is returned untouched
            if (count === 1)
                return {ticket: ticketFk, status: 'noSplit'};

            // Clone the original ticket and fetch the id of the new one
            const [, [{vNewTicket}]] = await Self.rawSql(`
                CALL vn.ticket_clone(?, @vNewTicket);
                SELECT @vNewTicket vNewTicket;`,
            [ticketFk], myOptions);

            if (vNewTicket === 0) return result;

            const sales = await models.Sale.find({
                where: {id: {inq: ticket.sales}}
            }, myOptions);

            const updateIsPicked = sales.map(({sid}) => Self.rawSql(`
                UPDATE vn.sale SET isPicked = (id = ?) WHERE ticketFk = ?`,
            [sid, ticketFk], myOptions));

            await Promise.all(updateIsPicked);

            // Move the selected sales to the cloned ticket and flag the original
            await Self.transferSales(ctx, ticketFk, vNewTicket, sales, myOptions);
            await Self.rawSql(`CALL vn.ticket_setState(?, ?)`, [ticketFk, 'FIXING'], myOptions);

            if (tx) await tx.commit();

            return {ticket: ticketFk, newTicket: vNewTicket, status: 'split'};
        } catch (e) {
            if (tx) await tx.rollback();
            return {ticket: ticketFk, status: 'error', message: e.message};
        }
    };
};
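A minimal usage sketch of the new method, assuming the same call pattern as the Split spec above; the `trySplit` helper, the ticket id and the sale ids are placeholders for illustration and are not part of this changeset. Over HTTP the method is presumably reachable at the Ticket model's `/split` path declared above.

```js
// Hypothetical sketch (not part of the changeset): exercises Ticket.split
// inside a rolled-back transaction, mirroring the Split spec above.
const {models} = require('vn-loopback/server/server');

async function trySplit(ctx, ticketFk, saleIds) {
    const tx = await models.Ticket.beginTransaction({});
    const options = {transaction: tx};
    try {
        // Same shape as the `ticket` body argument of the remote method
        const result = await models.Ticket.split(ctx, {ticketFk, sales: saleIds}, options);

        // result.status is 'noSplit', 'split' or 'error'; on 'split',
        // result.newTicket holds the id of the cloned ticket
        return result;
    } finally {
        await tx.rollback(); // illustration only: discard all changes
    }
}
```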

View File

@@ -43,8 +43,8 @@ module.exports = Self => {
             const {code} = await models.State.findById(params.stateFk, {fields: ['code']}, myOptions);
             params.code = code;
         } else {
-            const {id} = await models.State.findOne({where: {code: params.code}}, myOptions);
-            params.stateFk = id;
+            const state = await models.State.findOne({where: {id: params.code}}, myOptions);
+            params.stateFk = state.id;
         }

         if (!params.userFk) {
View File

@@ -13,6 +13,7 @@ module.exports = Self => {
     require('../methods/sale/usesMana')(Self);
     require('../methods/sale/clone')(Self);
     require('../methods/sale/getFromSectorCollection')(Self);
+    require('../methods/sale/replaceItem')(Self);

     Self.validatesPresenceOf('concept', {
         message: `Concept cannot be blank`
View File

@@ -26,6 +26,12 @@
         },
         "defaultAttenderFk": {
             "type": "number"
+        },
+        "lackAlertPrice": {
+            "type": "number"
+        },
+        "lackScopeDays": {
+            "type": "number"
         }
     },
     "relations": {

View File

@@ -46,5 +46,8 @@ module.exports = function(Self) {
     require('../methods/ticket/docuwareDownload')(Self);
     require('../methods/ticket/myLastModified')(Self);
     require('../methods/ticket/setWeight')(Self);
+    require('../methods/ticket/itemLack')(Self);
+    require('../methods/ticket/itemLackDetail')(Self);
+    require('../methods/ticket/split')(Self);
     require('../methods/ticket/getTicketProblems')(Self);
 };

View File

@@ -284,39 +284,6 @@
                 }
             }
         },
-        {
-            "relation": "business",
-            "scope": {
-                "fields": [
-                    "id",
-                    "started",
-                    "ended",
-                    "reasonEndFk",
-                    "departmentFk",
-                    "workerBusinessProfessionalCategoryFk"
-                ],
-                "include":[
-                    {
-                        "relation": "department",
-                        "scope": {
-                            "fields": ["id", "name"]
-                        }
-                    },
-                    {
-                        "relation": "reasonEnd",
-                        "scope": {
-                            "fields": ["id", "reason"]
-                        }
-                    },
-                    {
-                        "relation": "workerBusinessProfessionalCategory",
-                        "scope": {
-                            "fields": ["id", "description"]
-                        }
-                    }
-                ]
-            }
-        },
         {
             "relation": "boss",
             "scope": {

View File

@@ -70,7 +70,7 @@
                 <td width="5%">{{sale.itemFk}}</td>
                 <td class="number">{{sale.quantity}}</td>
                 <td width="50%">{{sale.concept}}</td>
-                <td width="5%" class="font light-gray" v-if="sale.subName">{{sale.subName}}</td>
+                <td width="5%" class="font light-gray">{{sale.subName}}</td>
                 <td class="number" v-if="showPrices">{{sale.price | currency('EUR', $i18n.locale)}}</td>
                 <td class="centered" width="5%" v-if="showPrices">{{(sale.discount / 100) | percentage}}</td>
                 <td class="centered" v-if="showPrices">{{sale.vatType}}</td>