โŒ

Normal view

There are new articles available, click to refresh the page.
Before yesterdayMain stream

import DB: ERROR 1435 (HY000) at line 166292: Trigger in wrong schema

I am working on Hostinger with WooCommerce. I used "Create a new staging environment" on hPanel, and the staging build failed after a few minutes. When I contacted Hostinger customer support, they first showed me "import DB: ERROR 1435 (HY000) at line 166292: Trigger in wrong schema". When I asked for more detailed information, they told me to export and import the database again. On my side the import into the staging database works fine, yet the staging website still does not run. Any idea what this "Trigger in wrong schema" issue is about? My production database has no issues at all.

[Create Staging from hPanel](https://i.sstatic.net/QsLD3Qjn.png)

[Answer from customer support 1](https://i.sstatic.net/pzBUiWwf.png)

[Answer from customer support 2](https://i.sstatic.net/o0vdBvA4.png)

Yup validation not working for multiple object fields with identical types in a schema

I'm using Yup for form validation in my React application. I have a schema where I need to validate multiple object fields independently. However, I'm encountering an issue where Yup validation only works for one object field at a time, even though the shapes of the objects are identical and share the same type.

import { object, string } from 'yup';

const comparisonTableSchema = object().shape({
  baseMonth: object().shape({
    _id: string().required('Please select a month'),
    name: string().required(),
  }),
  targetMonth: object().shape({
    _id: string().required('Please select a month'),
    name: string().required(),
  }),
  matrix: object().shape({
    _id: string().required('Please select a matrix'),
    name: string().required(),
  }),
  // Other fields...
});

The problem is that when I submit the form, Yup validation only shows errors for one object field at a time. If I fulfill the criteria for one field and submit again, then it shows errors for the other field.

I've tried various approaches, including object().noUnknown() and defining custom validation functions, but neither approach resolves the issue.

How can I ensure that Yup validates baseMonth, targetMonth, and matrix independently, considering they share the same object type, and shows errors for all fields simultaneously when they fail validation?

Getting jsonschema.exceptions.SchemaError: Schema is not of type 'object', 'boolean'

I am writing a Python program that validates a sample JSON document against a JSON schema, using a package called jsonschema. When I run my code I get an error indicating an issue in a certain segment of the JSON schema, but I can't make out what exactly the error means: "jsonschema.exceptions.SchemaError: [Schema] is not of type 'object', 'boolean'". Below is the Python code:

 # import jsonSchema_validate
 # from jsonSchema_validate import validate
from jsonschema import validate
schema = {"$schema":"http://json-schema.org/drafts/2020-12/schema","type":"object","properties":{"responseCode":{"type":"integer"},"message":{"type":"string"},"returnValueType":{"type":"string"},"returnValue":{"type":"object","properties":{"configuration":{"type":"object","properties":{"adminLocked":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"maxNumberOfMACAddresses":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}},"port":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"qosProfile":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"operational":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}}}},"accessConnection":{"type":"object","properties":{"accessTechnology":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"linkUp":{"type":"object","properties":{"value":{"type":"boolean"},"statusCode":{"type":"integer"}}},"optical":{"type":"object","properties":{"upStream":{"type":"object","properties":{"waveLength":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}},"signalPower":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"inRange":{"type":"object","properties":{"value":{"type":"boolean"},"statusCode":{"type":"integer"}}},"bytes":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}}}},"downStream":{"type":"object","properties":{"waveLength":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}},"signalPower":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"inRange":{"type":"object","properties":{"value":{"type":"boolean"},"statusCode":{"type":"integer"}}},"bytes":{"type":"object","properties":{"value":{"type":"string"},"st
atusCode":{"type":"integer"}}}}}}},"lineError":{"type":"object","properties":{"upStream":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}},"downStream":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}}}}}},"terminationPoint":{"type":"object","properties":{"cpeType":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"serial":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"cpeUp":{"type":"object","properties":{"value":{"type":"boolean"},"statusCode":{"type":"integer"}}},"ethernet":{"type":"object","properties":{"linkUp":{"type":"object","properties":{"value":{"type":"boolean"},"statusCode":{"type":"integer"}}},"bitrate":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"fullDuplex":{"type":"object","properties":{"value":{"type":"boolean"},"statusCode":{"type":"integer"}}},"maxPortSpeed":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}}}},"upTime":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}}}},"rf":{"type":"object","properties":{"outputSignalStrength":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"filter":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"optical":{"type":"object","properties":{"downStream":{"type":"object","properties":{"waveLength":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}},"signalPower":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}},"inRange":{"type":"object","properties":{"value":{"type":"boolean"},"statusCode":{"type":"integer"}}}}}}}}},"vLans":{"type":"array","items":[{"type":"object","properties":{"cVlan":{"type":"object","properties":{"statusCode":{"type":"integer"},"value":{"t
ype":"integer"}}},"learnedMACAddresses":{"type":"object","properties":{"statusCode":{"type":"integer"},"value":{"items":[{"properties":{"mac":{"type":"string"},"onPOI":{"properties":{"statusCode":{"type":"integer"},"value":{"type":"boolean"}},"type":"object"}},"type":"object"}],"type":"array"}}},"multicast":{"type":"object","properties":{"statusCode":{"type":"integer"},"value":{"type":"boolean"}}},"multicastSnoop":{"properties":{"items":[{"type":"string"}],"statusCode":{"type":"integer"},"type":"array"}},"tagged":{"type":"object","properties":{"statusCode":{"type":"integer"},"value":{"type":"boolean"}}},"tVlan":{"type":"object","properties":{"statusCode":{"type":"integer"},"value":{"type":"integer"}}},"whsid":{"type":"object","properties":{"statusCode":{"type":"integer"},"value":{"type":"string"}}}}}]},"poi":{"type":"object","properties":{"id":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}},"sVlan":{"type":"object","properties":{"value":{"type":"integer"},"statusCode":{"type":"integer"}}},"fPoi":{"type":"object","properties":{"value":{"type":"string"},"statusCode":{"type":"integer"}}}}}}}}}

validate(instance = {
    "message": "7e",
    "responseCode": -558638,
    "returnValue": {
        "accessConnection": {
            "accessTechnology": {
                "statusCode": -798758,
                "value": "<\"\n!wrOJ,"
            },
            "lineError": {
                "downStream": {
                    "value": -738597
                }
            }
        },
        "configuration": {
            "adminLocked": {
                "statusCode": 195229,
                "value": "HM.7W0zhsZ"
            },
            "maxNumberOfMACAddresses": {
                "statusCode": 495669,
                "value": -800730
            },
            "operational": {},
            "port": {
                "statusCode": -672240
            },
            "qosProfile": {}
        },
        "poi": {
            "fPoi": {
                "value": "B}Hfm"
            },
            "id": {
                "statusCode": 260531,
                "value": "<G8|r"
            }
        },
        "rf": {
            "filter": {
                "statusCode": 251614,
                "value": "jYE"
            },
            "optical": {}
        },
        "terminationPoint": {
            "cpeType": {
                "statusCode": -896729,
                "value": "G'?\n"
            },
            "cpeUp": {
                "statusCode": 606944
            },
            "ethernet": {
                "bitrate": {
                    "statusCode": -318966
                },
                "linkUp": {},
                "maxPortSpeed": {}
            },
            "serial": {
                "statusCode": 365503,
                "value": "\rv?xkkb"
            },
            "upTime": {}
        },
        "vLans": [
        ]
    },
    "returnValueType": "&~857\rS"
}, schema = schema)
# true

Below is the error received:

   PS C:\Users\abc> & C:/Users/abc/AppData/Local/Programs/Python/Python37/python.exe "c:/abc backup/D DRIVE DATA/Python Codes/jsonSchema_validate.py"
Traceback (most recent call last):
  File "c:/abc backup/D DRIVE DATA/Python Codes/jsonSchema_validate.py", line 638, in <module>
    }, schema=schema)
  File "C:\Users\abc\AppData\Local\Programs\Python\Python37\lib\site-packages\jsonschema\validators.py", line 1117, in validate
    cls.check_schema(schema)
  File "C:\Users\abc\AppData\Local\Programs\Python\Python37\lib\site-packages\jsonschema\validators.py", line 231, in check_schema
    raise exceptions.SchemaError.create_from(error)
jsonschema.exceptions.SchemaError: [{'type': 'object', 'properties': {'cVlan': {'type': 'object', 'properties': {'statusCode': {'type': 'integer'}, 'value': {'type': 'integer'}}}, 'learnedMACAddresses': {'type': 'object', 'properties': {'statusCode': {'type': 'integer'}, 'value': {'items': [{'properties': {'mac': {'type': 'string'}, 'onPOI': {'properties': {'statusCode': {'type': 'integer'}, 'value': {'type': 'boolean'}}, 'type': 'object'}}, 'type': 'object'}], 'type': 'array'}}}, 'multicast': {'type': 'object', 'properties': {'statusCode': {'type': 'integer'}, 'value': {'type': 'boolean'}}}, 'multicastSnoop': {'properties': {'items': [{'type': 'string'}], 'statusCode': {'type': 'integer'}, 'type': 'array'}}, 'tagged': {'type': 'object', 'properties': {'statusCode': {'type': 'integer'}, 'value': {'type': 'boolean'}}}, 'tVlan': {'type': 'object', 'properties': {'statusCode': {'type': 'integer'}, 'value': {'type': 'integer'}}}, 'whsid': {'type': 'object', 'properties': {'statusCode': {'type': 'integer'}, 'value': {'type': 'string'}}}}}] is not of type 'object', 'boolean'


PS C:\Users\abc> 

Can you please let me know how to solve this error?

Change the value of a key in an existing schema JSON?

I have a plugin in WordPress that generates HTML schema markup in application/ld+json format like this:

<script type="application/ld+json">{
    "@context": "http://schema.org/",
    "@type": "JobPosting",
    "datePosted": "2023-11-30T00:00:00+01:00",
    "title": "Web developer",
    "description": "something",
    "hiringOrganization": {
        "@type": "Organization",
        "name": ""
    },
    "identifier": {
        "@type": "PropertyValue",
        "name": "",
        "value": "TEST 123"
    },
    "jobLocation": {
        "@type": "Place",
        "address": "London, UK"
    },
    "validThrough": "2023-12-31"
}</script>

As you can see, the value of the name key for the Organization in the hiringOrganization object is empty, so I need a way to correct this value.

I can't change the schema code directly, so I need a way to do this by adding some HTML code after the existing schema.

Does anyone know a good way to do this? Can I change the value with JavaScript, or by adding some extra schema?

Appreciate the help!
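For context, a sketch of the JavaScript approach (the organization name "Acme Ltd" and the single-script-tag assumption are placeholders, not from the question): read the plugin's script tag, patch the parsed object, and write it back.

```javascript
// Pure helper, so the patching logic is testable outside the browser.
function patchHiringOrganization(jsonLdText, orgName) {
  const data = JSON.parse(jsonLdText);
  if (data["@type"] === "JobPosting" && data.hiringOrganization) {
    data.hiringOrganization.name = orgName;
  }
  return JSON.stringify(data);
}

// In the browser (assuming only one ld+json script on the page), run this
// after the plugin's output is in the DOM:
//   const tag = document.querySelector('script[type="application/ld+json"]');
//   tag.textContent = patchHiringOrganization(tag.textContent, "Acme Ltd");
```

One caveat: some validators and crawlers only read the initial HTML, so a server-side fix (e.g. a filter in the plugin, if it offers one) is generally more reliable than patching with JavaScript.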

Create a GUI from a XML schema automatically

I have to write a desktop application to edit data stored in an XML file. The format is defined by an XML Schema file (.xsd) and is quite complex.

Are there tools which can generate a basic GUI automatically? It's not yet decided which language to use; I have experience in Python, C++ (wxWidgets), and C# (.NET 1) with Windows Forms.

Getting "Maximum call stack size exceeded" while adding an array of objects in MongoDB using Mongoose

This is my Mongoose model for Question. I have updated the schema with an array of test cases, as shown in the code.

const TestCaseSchema = mongoose.Schema({
  input: {
    type: String,
    required: true,
  },
  expectedOutput: {
    type: String,
    required: true,
  },
});
const QuestionSchema = mongoose.Schema({
  title: {
    type: String,
    required: true,
    unique: true,
    max: 110,
  },
  description: {
    type: String,
    required: true,
  },
  difficulty: {
    type: String,
    required: true,
  },
  createdat: { type: Date, default: Date.now }, // pass the function itself, not Date.now()
  testcases: [TestCaseSchema],
});

This is the payload I am sending in the API request:

const payload = {
  ...,
  testcases: [
    { input: '1 2 3 4', expectedOutput: '2 3 4 5' },
    { input: '1 2 3 4', expectedOutput: '2 3 4 5' },
  ],
};

The question gets added in MongoDB with the test cases (see attached screenshot).

But on the frontend, when I fetch the list of questions, it shows the "Maximum call stack size exceeded" error (see attached screenshot).

My frontend code looks like this:

const payload = {
    title: "New Question ",
    description: "This is I am testing",
    difficulty: "Medium",
    testcases: [
      { input: "1 2 3 4", expectedOutput: "2 3 4 5" },
      { input: "1 2 3 4", expectedOutput: "2 3 4 5" },
    ],
  };
const handleClick = async () => {
    const result = await axios.post("/api/questions", payload);
    console.log(result);
  };

How to use component schemas for primitive types?

I want to create component schemas for single primitive key-value pairs in order to reuse them. I found two ways, but neither satisfies both of my requirements. The first solution nests the primitive value in an object with a single required key:

portComponent:
  type: object
  properties:
    port:
      type: integer
      minimum: 0
      maximum: 65535
  required:
    - port

The alternative is to define the primitive value directly, like:

portComponent:
  type: integer
  minimum: 0
  maximum: 65535

My goals are:

  1. Primitive values are always used with the same key. This reduces errors, and code generators produce exactly one class per key-value pair. Only #1 allows this.
  2. I can specify whether the primitive components are required. #1 only lets me require the whole component via allOf, but I cannot mark it as optional. With #2, values are added via a new key in the surrounding component, which I can easily add to required; but then code generation doesn't work and key names are not enforced.
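For reference, a middle-ground sketch (the `Port` and `Server` names are hypothetical): keep the reusable component primitive, as in #2, and control requiredness at each use site via `$ref` composition. This solves goal 2 but, as noted above, not the fixed-key goal 1.

```yaml
components:
  schemas:
    Port:
      type: integer
      minimum: 0
      maximum: 65535
    Server:
      type: object
      properties:
        port:
          $ref: '#/components/schemas/Port'
      required:
        - port   # required here; omit at another use site to make it optional
```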

Sqoop - Use schema in saved job

When I run this command in the shell, it works fine:

sqoop import --incremental append --check-column id_civilstatus --last-value -1 
--connect jdbc:postgresql://somehost/somedb --username someuser 
--password-file file:///passfile.txt --table sometable --direct -m 3 
--target-dir /jobs/somedir -- --schema someschema

But when I try to save it as a job:

sqoop job --create myjob -- import --incremental append --check-column id_civilstatus 
--last-value -1 --connect jdbc:postgresql://somehost/somedb --username someuser 
--password-file file:///passfile.txt --table sometable --direct -m 3 
--target-dir /jobs/somedir -- --schema someschema

Then I execute:

sqoop job --exec myjob

I get this error message:

PSQLException: ERROR: relation "sometable" does not exist

This error occurs because 'sometable' does not exist in the default schema.

Why does the sqoop job not take the schema parameter? Am I missing something?

Thanks
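For what it's worth, one workaround often suggested for this class of problem (untested here; `currentSchema` is a PostgreSQL JDBC URL parameter, not a Sqoop option) is to move the schema into the connect string so it does not depend on the trailing `-- --schema` arguments surviving the saved-job metastore round trip:

```
sqoop job --create myjob -- import --incremental append --check-column id_civilstatus
--last-value -1 --connect "jdbc:postgresql://somehost/somedb?currentSchema=someschema"
--username someuser --password-file file:///passfile.txt --table sometable --direct -m 3
--target-dir /jobs/somedir
```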

A way to set field validation attributes in pydantic

I have the following pydantic dataclass:

from pydantic.dataclasses import dataclass

@dataclass
class LocationPolygon:
    type: int
    coordinates: list[list[list[float]]]

This is taken from a JSON schema where the innermost array has maxItems=2, minItems=2.
I couldn't find a way to set a validation for this in pydantic:
setting it on the field works only on the outer level of the list.

@dataclass
class LocationPolygon:
    type: int
    coordinates: list[list[list[float]]] = Field(maxItems=2, minItems=2)

Using @validator and updating the field attribute doesn't help either, as the value was already set and the basic validations have already run:

@validator('coordinates')
def coordinates_come_in_pair(cls, values, field):
    field.sub_fields[0].sub_fields[0].field_info.min_items = 2
    field.sub_fields[0].sub_fields[0].field_info.max_items = 2

I thought about using root_validator with pre=True, but only the raw values are available there.

Is there a way to tweak the field validation attributes, or to use pydantic's built-in rules to make that validation work?

Schema is not validating "itemprop"

This is the code:

<a href="https://www.mim-essay.com" class="logo-imgContainer navbar-brand ml-3 d-flex align-items-center">
<img id="_logo6" itemprop="logo" fetchpriority="high" src="https://www.mim-essay.com/images/MiM-Essay-Sticky-Logo-6.png" alt="mim essay" class="logo-img"></a>

Why am I getting this error?

Error:- https://www.mim-essay.com/images/MiM-Essay-Sticky-Logo-6.png (The property logo is not recognised by the schema (e.g. schema.org) for an object of type WebPage.)

What should I do?

I have tried changing the itemprop value from logo to different values but still get the same error.
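For context: in microdata, itemprop attaches to the nearest ancestor itemscope, and the error says that ancestor is a WebPage, where logo is not a recognised property. A sketch of scoping the image to an Organization item instead (the wrapper div is hypothetical markup, not from the original page):

```html
<div itemscope itemtype="https://schema.org/Organization">
  <a href="https://www.mim-essay.com" class="logo-imgContainer navbar-brand ml-3 d-flex align-items-center">
    <img id="_logo6" itemprop="logo" fetchpriority="high"
         src="https://www.mim-essay.com/images/MiM-Essay-Sticky-Logo-6.png"
         alt="mim essay" class="logo-img">
  </a>
</div>
```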

Snowflake: load data with the cedilla character (Ç) as the delimiter

I am trying to load data into a sample table with the below DDL:

create or replace table sample1(id varchar(1000), name varchar(200), city varchar(20));

File content

'1'ร‡'aaa'ร‡'a'
'2'ร‡'bbb'ร‡'b'

Doing a manual load from the UI, I used Ç and \u00c7 as the delimiter, but I get the below error:

Error on line 2, character 1, column 1 ("ID") Number of columns in file (1) does not match that of the corresponding table (3), use file format option error_on_column_count_mismatch=false to ignore this error
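For reference, a file-format sketch (the format and stage names are hypothetical; the assumption is that the file is UTF-8 encoded — if it was saved in a single-byte encoding such as Latin-1, the Ç byte sequence won't match and the whole line parses as one column, which is exactly the error shown):

```sql
CREATE OR REPLACE FILE FORMAT my_cedilla_format
  TYPE = 'CSV'
  FIELD_DELIMITER = 'Ç'
  FIELD_OPTIONALLY_ENCLOSED_BY = ''''   -- the values are wrapped in single quotes
  ENCODING = 'UTF8';

COPY INTO sample1
  FROM @my_stage/sample1.csv            -- hypothetical stage/path
  FILE_FORMAT = (FORMAT_NAME = 'my_cedilla_format');
```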

Extending Django model

I have two apps in my project, app1 and app2.
In app1 I have two models called Task and Job. I want to add a new feature to these two models: a category. But it's more complicated than just a new field, and app2 is mainly responsible for estimating the category, so I created a new model in app2 as below:

class Category(models.Model):
    CAT1 = 1
    CAT2 = 2
    OTHER = 6
    UNKNOWN = 7
    CATEGORY_CHOICES = (
        (CAT1, "Cat1"),
        (CAT2, "Cat2"),
        (OTHER, "Other"),
        (UNKNOWN, "Unknown"),
    )
    auto_category = models.PositiveSmallIntegerField(choices=CATEGORY_CHOICES, default=UNKNOWN)
    user_category = models.PositiveSmallIntegerField(choices=CATEGORY_CHOICES, default=UNKNOWN)
    modify_timestamp = models.DateTimeField(blank=True, null=True)
    modified_by = models.ForeignKey("app1.User", on_delete=models.PROTECT, related_name='category_creator', blank=True, null=True)

Also, I want every Task and Job object to have a default category as soon as it is created, so I don't have to deal with None values on the frontend; "Unknown" should be the default value. So I created the below function in app2/utils.py:

def create_default_category():
    new_category = Category.objects.create()
    return new_category.pk

Then I added a OneToOneField relation to my Task and Job models in app1, along with the default value:

class Task(models.Model):
    # Some already existing fields
    category = models.OneToOneField('app2.Category', on_delete=models.CASCADE, related_name='task_category', default=create_default_category)


class Job(models.Model):
    # Some already existing fields
    category = models.OneToOneField('app2.Category', on_delete=models.CASCADE, related_name='job_category', default=create_default_category)

But all of the above doesn't work as I expected.
When I try to run the migration for app1, there is an error saying:

(venv) C:\project>python manage.py migrate app1
Operations to perform:
  Apply all migrations: app1
Running migrations:
  Applying app1.0003_auto_20230828_1538...
Traceback (most recent call last):
  File "C:\project\venv\lib\site-packages\django\db\backends\utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
psycopg2.errors.UniqueViolation: could not create unique index "app1_task_category_id_key"
DETAIL:  Key (category_id)=(1) is duplicated.


The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "manage.py", line 21, in <module>
    main()
  File "manage.py", line 17, in main
    execute_from_command_line(sys.argv)
  File "C:\project\venv\lib\site-packages\django\core\management\__init__.py", line 381, in execute_from_command_line
    utility.execute()
  File "C:\project\venv\lib\site-packages\django\core\management\__init__.py", line 375, in execute
    self.fetch_command(subcommand).run_from_argv(self.argv)
  File "C:\project\venv\lib\site-packages\django\core\management\base.py", line 323, in run_from_argv
    self.execute(*args, **cmd_options)
  File "C:\project\venv\lib\site-packages\django\core\management\base.py", line 364, in execute
    output = self.handle(*args, **options)
  File "C:\project\venv\lib\site-packages\django\core\management\base.py", line 83, in wrapped
    res = handle_func(*args, **kwargs)
  File "C:\project\venv\lib\site-packages\django\core\management\commands\migrate.py", line 232, in handle
    post_migrate_state = executor.migrate(
  File "C:\project\venv\lib\site-packages\django\db\migrations\executor.py", line 117, in migrate
    state = self._migrate_all_forwards(state, plan, full_plan, fake=fake, fake_initial=fake_initial)
  File "C:\project\venv\lib\site-packages\django\db\migrations\executor.py", line 147, in _migrate_all_forwards
    state = self.apply_migration(state, migration, fake=fake, fake_initial=fake_initial)
  File "C:\project\venv\lib\site-packages\django\db\migrations\executor.py", line 245, in apply_migration
    state = migration.apply(state, schema_editor)
  File "C:\project\venv\lib\site-packages\django\db\migrations\migration.py", line 124, in apply
    operation.database_forwards(self.app_label, schema_editor, old_state, project_state)
  File "C:\project\venv\lib\site-packages\django\db\migrations\operations\fields.py", line 110, in database_forwards
    schema_editor.add_field(
  File "C:\project\venv\lib\site-packages\django\db\backends\base\schema.py", line 447, in add_field
    self.execute(sql, params)
  File "C:\project\venv\lib\site-packages\django\db\backends\base\schema.py", line 137, in execute
    cursor.execute(sql, params)
  File "C:\project\venv\lib\site-packages\django\db\backends\utils.py", line 99, in execute
    return super().execute(sql, params)
  File "C:\project\venv\lib\site-packages\django\db\backends\utils.py", line 67, in execute
    return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
  File "C:\project\venv\lib\site-packages\django\db\backends\utils.py", line 76, in _execute_with_wrappers
    return executor(sql, params, many, context)
  File "C:\project\venv\lib\site-packages\django\db\backends\utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
  File "C:\project\venv\lib\site-packages\django\db\utils.py", line 89, in __exit__
    raise dj_exc_value.with_traceback(traceback) from exc_value
  File "C:\project\venv\lib\site-packages\django\db\backends\utils.py", line 84, in _execute
    return self.cursor.execute(sql, params)
django.db.utils.IntegrityError: could not create unique index "app1_task_category_id_key"
DETAIL:  Key (category_id)=(1) is duplicated.

There are a lot of existing Task and Job objects in my db, but there are no Category objects at all; that table is literally empty.

Does anyone know what could potentially cause this problem, or what the proper solution for such a problem is?

I already ran the migration for app2 and the table was created normally in the db; I am also able to manually create new Category objects.
I tried to reset the indexes on the app2.Category table in the database, but it didn't change anything.
I reverted and ran the migration again; it also didn't help.

I am also aware that I could create an abstract model class in app1 and just inherit it in the Task and Job models. Is that a better solution?

I just thought that since Category is more connected with app2, it would be reasonable to put the code in the app2/models.py file.

Versioning Jsons with JSON Schema

I'm starting to write a JSON Schema for a category of parts that our industry uses. I have looked into how best to version JSON files against the schema going forward, but I couldn't find anything concrete as an implementation.

The easiest way to relate this is to think of vehicles. We're defining a vehicles schema (vehicles-v1.0.0.json). Inside vehicles we have, at this stage, 4 different vehicle types: car, SUV, truck, motorcycle. I think once the space industry improves, we may need to add several others (maybe train, plane, taxi). We'd want to bump that to, say, version 2.0.

If the current JSON files written adhere to $id vehicles-v1.0.0, how do we ensure that the ones with train, plane & taxi are defined for vehicles-v2.0.0?

I've had some thoughts, suggestions, and examples, but they all seem a bit half thought out. As far as I can tell, in our JSON files themselves (say toyota-suv.json or bmw-suv.json, which implement vehicles-v1.0.0.json) there's no link to the JSON Schema at all.

  1. Use a $schema property to the top of each JSON file that references the schema in use
  2. Use a _version object that has { $schema: <uri>, $version: "1.0.0" }
  3. Don't use anything in the .json files and instead let the parser deal with it.

What's the guidance on how it's been done thus far? How have you done versioning for JSON files with JSON Schema?

Also, unfortunately our domain isn't as easy as vehicles (Toyota, BMW, Audi, etc. have small variations in the specifications of the materials that they produce), so our schema has about 20 mandatory properties, and the manufacturers append others with an x-<company>-property field specific to their domain.
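For what it's worth, option 1 above might look like this in an instance file (the URI is hypothetical; note that $schema on instance documents is an editor/tooling convention — many editors use it to pick the schema — rather than something the JSON Schema spec defines for instances, which is essentially option 3's "let the parser deal with it" concern):

```json
{
  "$schema": "https://example.com/schemas/vehicles-v1.0.0.json",
  "type": "suv",
  "manufacturer": "Toyota"
}
```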

Modify an XML file in Python

I have an XML file (filename: abc.nuspec) that looks like this:

<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
  <metadata>
    <id>aa-bb-cc</id>
    <version>1.0.0</version>
    <authors>first last</authors>
    <requireLicenseAcceptance>false</requireLicenseAcceptance>
    <description>Google DialogFlow proto classes library</description>
    <dependencies>
      <group targetFramework=".NETStandard2.0">
        <dependency id="Google.Cloud.Dialogflow.V2" version="3.1.0" exclude="Build,Analyzers" />
        <dependency id="Google.Protobuf" version="3.15.6" exclude="Build,Analyzers" />
        <dependency id="Grpc.Core" version="2.36.1" exclude="Build,Analyzers" />
        <dependency id="Grpc.Core.Api" version="2.36.1" exclude="Build,Analyzers" />
        <dependency id="Grpc.Net.Client" version="2.36.0" exclude="Build,Analyzers" />
      </group>
    </dependencies>
  </metadata>
</package>

I would like to add the below line to the metadata element, if it is not already present:

<repository url="https://github.com/test/test-repo" type="get" />

Here is the code I found:

import xml.etree.ElementTree as ET

root = ET.parse(f'abc.nuspec').getroot()
path = root.find("metadata")
myattributes = {"url": "https://github.com/test/test-repo", "type": "get"}
new = ET.SubElement(path, 'repository', attrib=myattributes)
print(ET.tostring(root, short_empty_elements=False).decode())

But the above code didn't work. Am I doing something wrong here?
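For context, a likely factor worth checking (an assumption, since the question doesn't say where it fails): the document declares a default namespace, so `root.find("metadata")` matches nothing. A namespace-aware sketch using only the standard library:

```python
import xml.etree.ElementTree as ET

NUSPEC = """<?xml version="1.0" encoding="utf-8"?>
<package xmlns="http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd">
  <metadata>
    <id>aa-bb-cc</id>
  </metadata>
</package>"""

NS = {"n": "http://schemas.microsoft.com/packaging/2013/05/nuspec.xsd"}

root = ET.fromstring(NUSPEC)

# Without the namespace prefix nothing matches; with it, the element is found.
assert root.find("metadata") is None
metadata = root.find("n:metadata", NS)

# Add <repository .../> only if it is not already present.
if metadata.find("n:repository", NS) is None:
    ET.SubElement(metadata, "repository",
                  {"url": "https://github.com/test/test-repo", "type": "get"})

# The element we just created has no namespace, so a plain find works here.
repo = metadata.find("repository")
```

When writing the file back out, registering the namespace (`ET.register_namespace("", ...)`) keeps ElementTree from emitting `ns0:` prefixes everywhere.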

How to do schema-stitching and introspection in new @apollo/server and graphql-tools?

I'm working on a Node.js project with a main service: a central component where various remote microservices (each exposed at its own "port/graphql" endpoint) are stitched together to create a unified API endpoint. It acts as a gateway or aggregator, allowing clients to access functionality provided by multiple microservices through a single entry point.

Now, I want to upgrade all the npm packages to the latest versions. The main service currently uses the packages and versions below, moving to the following:

  - graphql-tools: v4.0.5 → v9.0.0
  - apollo-server-express v2.7.2 → @apollo/server v4.8.0
  - express: v4.17.1 → v4.18.2
  - graphql: v14.4.2 → v16.7.1

I was able to stitch the remote schemas using the old packages; with the new versions I can still stitch, but I am unable to resolve the APIs.

I noticed that the introspectSchema and makeRemoteExecutableSchema functions are now deprecated.

I tried schema stitching using introspectionSchema with old version of packages and then again with new versions of packages:-

Old code:-

// Code snippet 1: Stitching remote schemas
const createSchema = async () => {
  // ...
  for (let service of servicesList) {
    // ...
    retSchema = await utils.getRemoteSchema(
      url,
    );
    // ...
  }
  return mergeSchemas({
    schemas: remoteSchemas,
  });
};

// Code snippet 2: Method to get remote schema and create executable schema

  getRemoteSchema: async (url) => {
    const schemaHttpLink = new HttpLink({ uri, fetch });
    const httpLink = setContext((request, previousContext) => {
      return {
        headers: {
          // ...
        },
      };
    }).concat(schemaHttpLink);
    const schema = await introspectSchema(schemaHttpLink);
    const executableSchema = makeRemoteExecutableSchema({
      schema,
      link,
    });
    // ...
  }

// Code snippet 3: Creating the ApolloServer instance
const start = async () => {
  const schema = await createSchema();
  const server = new ApolloServer({
    schema,
    context: async ({ req, connection }) => {
      // ...
    },
  });
  // ...
};

New Code:-

// Code snippet 1: Stitching remote schemas

import { buildHTTPExecutor } from "@graphql-tools/executor-http";
import { wrapSchema, schemaFromExecutor } from "@graphql-tools/wrap";
const createSchema = async () => {
  // ...
  for (let service of servicesList) {
    // ...
    retSchema = await utils.getRemoteSchema(
      url,
    );
    // ...
  }
  return mergeSchemas({
    schemas: remoteSchemas,
  });
};

// Code snippet 2: Method to get remote schema and create executable schema

  getRemoteSchema: async (url) => {

   const remoteExecutor = buildHTTPExecutor({
      endpoint: uri,
    });

      let Subschema = {
      schema: await schemaFromExecutor(remoteExecutor),
      executor: remoteExecutor,
    };

    return Subschema;

  }

// Code snippet 3: Creating the ApolloServer instance
const schema = await createSchema();

  const server = new ApolloServer({
    schema,
    introspection: true,
  });

const app = express();

await server.start();

  app.use(
    `/graphql`,
    cors(),
    json(),
    expressMiddleware(server, {
    context: async ({ req, connection }) => {
    //Some manipulation of context //
    return context;
        }
        if (connection && connection.context) {
          return connection.context;
        }
     }`

With the latest versions and the new code I am able to stitch the schemas and serve the introspection query, but I receive null when calling the APIs stitched into my main service.

Could you please guide me on how to update my code to work with the latest versions of graphql-tools and apollo-server? Specifically, I need help with the replacement for introspectSchema and makeRemoteExecutableSchema, and any other changes necessary to successfully stitch the remote schemas together. Thank you in advance!
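One frequent cause of null data after this exact migration is that the old setContext link injected per-request auth headers into every delegated call, while buildHTTPExecutor sends nothing extra unless configured to. The wrapping idea can be modeled in plain JavaScript (the executor and context shapes below are made up for illustration, not the graphql-tools API; with @graphql-tools/executor-http the equivalent is passing a headers option to buildHTTPExecutor):

```javascript
// Hedged sketch: forward headers from the per-request GraphQL context to the
// remote service by wrapping the executor. `fakeExecutor` stands in for a
// real HTTP executor and simply echoes back the headers it receives.
function withContextHeaders(baseExecutor) {
  return (request) => {
    const ctx = request.context || {};
    const headers = ctx.headers || {};
    return baseExecutor({ ...request, headers });
  };
}

const fakeExecutor = (request) => ({
  data: { receivedAuth: (request.headers || {}).authorization || null },
});

const executor = withContextHeaders(fakeExecutor);
const result = executor({
  document: "{ me { id } }",
  context: { headers: { authorization: "Bearer abc" } },
});
// result.data.receivedAuth is "Bearer abc"
```

If the remote services require authentication, compare the headers the old HttpLink chain sent with what the new executor sends; a missing Authorization header often yields `data: null` rather than a transport error.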

How to write data into a Snowflake database using Python

I need to write data into a Snowflake database. I have the script below, but I want to write the data without supplying a username and password; is there another way? Note: I have to run this script in Databricks.

import json
import snowflake.connector

# Set up the Snowflake connection parameters
conn = snowflake.connector.connect(
    user='your_username',
    password='your_password',
    account='your_account_name',
    warehouse='your_warehouse'
)

# Create a cursor object to execute SQL statements
cursor = conn.cursor()


# Assuming `json_data` contains the fetched JSON data
parsed_data = json.loads(json_data)

# Iterate through the parsed data and insert it into Snowflake.
# Bind parameters instead of formatting values into the SQL string,
# which avoids SQL injection and quoting issues.
for record in parsed_data:
    # Assuming your table has columns like `column1`, `column2`, etc.
    cursor.execute(
        "INSERT INTO your_table (column1, column2) VALUES (%s, %s)",
        (record['key1'], record['key2']),
    )

# Commit the changes
conn.commit()

cursor.close()
conn.close()
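On Databricks, one common way to avoid hard-coding the username and password is to read them from a Databricks secret scope at runtime (key-pair authentication is another option). A minimal sketch, assuming a `dbutils.secrets`-style getter and a secret scope named "snowflake" that you would need to create yourself:

```python
def build_snowflake_conn_args(get_secret, account, warehouse):
    """Assemble kwargs for snowflake.connector.connect() without hard-coding
    credentials. `get_secret` is any callable shaped like dbutils.secrets.get
    (available in Databricks notebooks); the scope and key names here are
    assumptions -- create them with the Databricks secrets CLI/API first."""
    return {
        "user": get_secret("snowflake", "user"),
        "password": get_secret("snowflake", "password"),
        "account": account,
        "warehouse": warehouse,
    }

# In a Databricks notebook you would then connect with:
#   conn = snowflake.connector.connect(
#       **build_snowflake_conn_args(dbutils.secrets.get,
#                                   "your_account_name", "your_warehouse")
#   )
# Alternatively, key-pair authentication replaces `password` with a
# `private_key` argument if a public key is registered for the Snowflake user.
```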


FLATTEN is not exploding rows in Snowflake

Input:

Policy  break_down
------  ----------
1234    "[{Business/Corporation 49 0.5145} {Corporate Formation 5 0.06999999999999999} {Taxation 1 0.0105} {Estate/Trust/Probate 45 0.5625}]"
234     null

The expected output is 4 rows, as below (since the list contains 4 values):

Expected output:

policy  Description           Pricing_Mod_Factor  Pricing_Mod_Premium
------  -----------           ------------------  -------------------
1234    Business/Corporation  49                  0.5145
1234    Corporate Formation   5                   0.06999999999999999
1234    Taxation              10                  0.0105
1234    Estate/Trust/Probate  45                  0.5625

Query:

with temp as (
    select
        b.id,
        replace(get(rater_response:rate_breakdown:coverage_output_list, 0):factor_breakdown:per_attorney_premium:lookup_values, '"', '') as arr1
        --b.*, c.insurance_application_id
    from rater b
)
select
    id,
    flattened_data.value
    /* substring(replace(split(flattened_data.value, '}')[0], '[{', ''), 1, regexp_instr(replace(split(flattened_data.value, '}')[0], '[{', ''), '\\d') - 2) as col1 */
from temp,
    lateral flatten(input => arr1) as flattened_data;

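The likely reason FLATTEN produces no rows here is that break_down is a plain string, not a valid JSON array ({Business/Corporation 49 0.5145} has no quotes, colons, or commas), so there is no array-typed VARIANT to explode. One workaround is to split the string manually and flatten the resulting array; a sketch (table and column names are assumptions based on the question, untested against the real data):

```sql
-- Hedged sketch: strip the outer brackets, split the string into one element
-- per "{...}" group, then FLATTEN that array into rows. The last two
-- space-separated tokens of each element are the numeric factors.
WITH temp AS (
    SELECT
        id AS policy,
        REPLACE(REPLACE(break_down, '[{', ''), '}]', '') AS cleaned
    FROM rater
)
SELECT
    t.policy,
    REGEXP_REPLACE(f.value::string, ' [^ ]+ [^ ]+$', '') AS description,
    SPLIT_PART(f.value::string, ' ', -2)                 AS pricing_mod_factor,
    SPLIT_PART(f.value::string, ' ', -1)                 AS pricing_mod_premium
FROM temp t,
     LATERAL FLATTEN(input => SPLIT(t.cleaned, '} {')) f;
```

Rows with a null break_down (policy 234) drop out naturally, since SPLIT of NULL gives FLATTEN nothing to explode.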
Trying to use "virtual populate" with Mongoose 7, but couldn't map any data

I use Mongoose.js with Express.js and am trying to use the "virtual populate" feature.
I have two collections, "Room" and "Order", defined as follows:

const RoomSchema = new mongoose.Schema(
  {
    title: {
      type: String,
      required: true,
    },
    roomNumbers: [
      {
        number: Number
      },
    ],
  }
);
export default mongoose.model("Room", RoomSchema);

const OrderSchema = new mongoose.Schema(
  {
    userId: { type: String, required: true }, 
    roomNumberId: [
      {
        type: mongoose.Types.ObjectId,
        ref: "Room",
      },
    ]   
  },
  {
    toObject: {
      virtuals: true,
    },
    toJSON: {
      virtuals: true,
    },
  }
);
OrderSchema.virtual("roomNumberList", {
  ref: "Room",
  localField: "roomNumberId", 
  foreignField: "roomNumbers",
});
export default mongoose.model("Order", OrderSchema);

My goal is to query orders by the userId of the Order collection and populate the roomNumbers of the Room collection. So my data-access code is:
export const getAllOrdersByUserId = async (req, res, next) => {
  const userId = req.params.id;
  try {
    const ordersList = await Order.find({ userId: userId })
      .populate("roomNumberList") 
      .exec();
    console.log(ordersList);
    res.status(200).json(ordersList);
  } catch (error) {
    next(errorMessage(500, "Server error", error));
  }
}

However, the logged result is:

{
    _id: new ObjectId("64b70c60d6048e2881099496"),
    userId: '6478efd33cc47792f3ce6bc9',
    roomNumberId: [
      new ObjectId("6495c7caef3162cf3e2634bd"),
      new ObjectId("6495c7d3ef3162cf3e2634c2"),
      new ObjectId("6495c7d7ef3162cf3e2634c7")
    ],
    createdAt: 2023-07-18T22:04:16.178Z,
    updatedAt: 2023-07-18T22:04:16.178Z,
    __v: 0,
    roomNumberList: [],
    id: '64b70c60d6048e2881099496'
  }

The roomNumberList part of the result is empty.
I spent days searching for possible answers but could not find one.
Please share any ideas. Thanks.
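For what it's worth, virtual populate matches each value in localField against the values found at foreignField on the foreign collection. Here roomNumberId holds subdocument ids while foreignField: "roomNumbers" points at whole subdocuments, so nothing ever matches. A plain-JavaScript model of that matching logic (the data and the matchVirtual helper below are made up for illustration, not the Mongoose API):

```javascript
// Hedged sketch: emulate how virtual populate compares localField values
// against foreignField values on the foreign collection's documents.
function matchVirtual(orderLocalIds, rooms, foreignField) {
  const matches = [];
  for (const room of rooms) {
    let foreignValues;
    if (foreignField === "roomNumbers") {
      // Compares against whole subdocuments -> never equal to an id
      foreignValues = room.roomNumbers;
    } else if (foreignField === "roomNumbers._id") {
      // Compares against the subdocument ids -> can match roomNumberId
      foreignValues = room.roomNumbers.map((rn) => rn._id);
    }
    if (foreignValues.some((v) => orderLocalIds.includes(v))) {
      matches.push(room);
    }
  }
  return matches;
}

const rooms = [
  {
    title: "Deluxe",
    roomNumbers: [{ _id: "a1", number: 101 }, { _id: "a2", number: 102 }],
  },
];
const localIds = ["a1"]; // stands in for Order.roomNumberId

// matchVirtual(localIds, rooms, "roomNumbers")      -> no matches
// matchVirtual(localIds, rooms, "roomNumbers._id")  -> matches the room
```

So the virtual in OrderSchema would likely need foreignField: "roomNumbers._id" for roomNumberList to be filled.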

โŒ
โŒ