โŒ

Normal view

There are new articles available, click to refresh the page.
Before yesterdayMain stream

In an Ember.js Octane Acceptance Test, how do I query and/or set the test window height?

I have an acceptance test to ensure that a PDF viewer on a specific route isn't accidentally sized incorrectly. Due to a convergence of prior choices and the way the embed tag works, the viewer can sometimes become truncated when seemingly unrelated elements are restyled. The test exists to warn me when this has occurred, and it works.

EXCEPT that on some test machines with very small displays, the test fails. Trying to reproduce the problem while running with the --server option, I can make the test fail by making the test window very narrow.

How do I control/test for this?

I don't see any method to either get the screen size or set it. The Ember forum states that I should ask these kinds of questions here.

Thanks in advance!

The variance represents all the cumulative borders, margins, etc. between the outer element and the actual viewer. This remains consistent except when the display is very, very narrow and some elements need to wrap.

    await visit('/code-nav');
    await fillIn('[data-test-code-nav-selector]', 'PDF_I10_PCS_REFERENCE_MANUAL');

    let applicationYieldWrapper = find('[data-test-application-yield-wrapper]');
    let applicationYieldWrapperHeight = applicationYieldWrapper.offsetHeight;

    await waitFor('[data-test-pdf-viewer]');
    let pdfViewer = find('[data-test-pdf-viewer]');
    let pdfViewerHeight = pdfViewer.offsetHeight;

    let variance = 136; // borders, margins, etc.

    assert.equal(pdfViewerHeight + variance, applicationYieldWrapperHeight,
      'PDF viewer height plus variance should equal the wrapper height');
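In tests, Ember typically renders the application into the `#ember-testing-container` element, and that element's size, not the real browser window's, governs layout. The test helpers don't expose a way to resize the window itself, but you can style the container to a known size before measuring. A minimal sketch, with an illustrative helper name of my own (the container id is Ember's default):

```javascript
// Force a known viewport for the app under test by sizing the element it
// renders into. Works on any element-like object exposing a `style` bag.
function setTestingContainerSize(container, width, height) {
  container.style.width = `${width}px`;
  container.style.height = `${height}px`;
}

// In an acceptance test this might be called as:
//   setTestingContainerSize(
//     document.getElementById('ember-testing-container'), 1280, 800);
// before `visit()`, so the PDF viewer lays out at a predictable size
// regardless of the machine's physical display.
```

Pinning the container size this way should also make the `variance` constant stable, since elements no longer wrap on narrow displays.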

Pytest monkeypatch a multiprocess function for testing

Pytest monkeypatching a function that uses multiple processes (via concurrent.futures.ProcessPoolExecutor) does not work as expected. If the same function is written with a single process or multiple threads (via concurrent.futures.ThreadPoolExecutor), monkeypatch works as expected.

Why does the multiprocess monkeypatch fail, and how does one correctly monkeypatch a multiprocess function for testing?

The simplest code example to illustrate my question is below. In actual usage, I would try to monkeypatch a function imported from another module.

# file_a.py

import concurrent.futures as ccf

MY_CONSTANT = "hello"

def my_function():
    return MY_CONSTANT


def singleprocess_f():
    result = []
    for _ in range(3):
        result.append(my_function())
    return result


def multithread_f():
    result = []
    with ccf.ThreadPoolExecutor() as executor:
        futures = []
        for _ in range(3):
            future = executor.submit(my_function)
            futures.append(future)
        for future in ccf.as_completed(futures):
            result.append(future.result())
    return result


def multiprocess_f():
    result = []
    with ccf.ProcessPoolExecutor() as executor:
        futures = []
        for _ in range(3):
            future = executor.submit(my_function)
            futures.append(future)
        for future in ccf.as_completed(futures):
            result.append(future.result())
    return result

I expected all tests to pass:

# test_file_a.py

from file_a import multiprocess_f, multithread_f, singleprocess_f

# PASSES:
def test_singleprocess_f(monkeypatch):
    monkeypatch.setattr("file_a.MY_CONSTANT", "world")
    result = singleprocess_f()
    assert result == ["world"] * 3

# PASSES:
def test_multithread_f(monkeypatch):
    monkeypatch.setattr("file_a.MY_CONSTANT", "world")
    result = multithread_f()
    assert result == ["world"] * 3

# FAILS:
def test_multiprocess_f(monkeypatch):
    monkeypatch.setattr("file_a.MY_CONSTANT", "world")
    result = multiprocess_f()
    assert result == ["world"] * 3

How to auto-convert a time string to a UNIX timestamp

My backtesting code reads UNIX timestamps, but the time in the CSV file is in this format: 2020-04-10 04:00:00.
It should be like this: 1512045180000. What changes need to be made to read it? I tried different ways but it's hard for me.


import os
from datetime import datetime

import pandas as pd
from dateutil.parser import parse


def getEthereumData(start_date, end_date):
    print("Start reading ethereum data..")
    path = os.path.dirname(os.path.abspath(__file__))
    data = pd.read_csv(os.path.join(path, "data", "BTCUSDT-1m-data.csv"))
    data['DateTime'] = [datetime.fromtimestamp(ts / 1000) for ts in data["Timestamp"]]
    data = data.set_index('DateTime')
    data = data.dropna()
    data = data[parse(start_date):parse(end_date)]
    print("Reading complete")

    return data
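A sketch of the conversion itself, assuming the CSV times are UTC (the column names here are illustrative): parse the string column with pandas, then express it as milliseconds since the epoch, which is the 13-digit form the backtester expects.

```python
import pandas as pd

# Sample frame standing in for the CSV column (name "Time" is illustrative)
df = pd.DataFrame({"Time": ["2020-04-10 04:00:00"]})

# Parse to tz-aware datetimes, then convert to UNIX milliseconds
parsed = pd.to_datetime(df["Time"], utc=True)
df["Timestamp"] = (parsed - pd.Timestamp("1970-01-01", tz="UTC")) // pd.Timedelta(milliseconds=1)
```

If the CSV times are in a local timezone rather than UTC, localize with `.dt.tz_localize(...)` first; otherwise every timestamp will be offset by the UTC difference.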

How to prevent VSCode from automatically wrapping WebdriverIO and Mocha.js code in parentheses?

I am trying to write some tests using WebdriverIO and Mocha.js in VSCode, but when I write the code, VSCode automatically wraps it in parentheses, which makes my code fail.

How can I turn this feature off?

When I write the code the first way, it cannot find the element (because of the async mode, I think). I don't want these parentheses, but VSCode automatically adds them.

When I write the code the other way, all the buttons are clicked without any problem.

Node.js test: mock or spy on an internal property

In my pet project I have a few files to test. My employees.js controller is:

let employees = [
  { id: '1', name: "Edmon D'antes", status: 'worker' },
  { id: '2', name: 'Alluri Sitarama Raju', status: 'hero' },
  { id: '3', name: 'Roman Shukhevych', status: 'hero' },
  { id: '4', name: 'Ptn Pnh', status: 'huylo' },
]

export const getAll = (req, res) => {
  res.status(200).json(employees)
}

export const create = (req, res) => {
  const newEmployee = {
    id: Date.now().toString(),
    ...req.body,
  }
  employees.push(newEmployee)
  res.status(201).json(newEmployee)
}

export const remove = (req, res) => {
  let message = 'Employee has been dismissed.'
  employees = employees.filter((e) => {
    if (e.id === req.params.id && e.status === 'hero') {
      message = 'You must respect heroes!'
      return true
    }
    return e.id !== req.params.id
  })
  res.json({ message })
}

My employees.test.js test is

import * as employeesModule from './employees.js'

let mockEmployees = [
  { id: '1', name: "Mocked Edmon D'antes", status: 'worker' },
  { id: '2', name: 'Mocked Alluri Sitarama Raju', status: 'hero' },
  { id: '3', name: 'Mocked Roman Shukhevych', status: 'hero' },
  { id: '4', name: 'Mocked Ptn Pnh', status: 'huylo' },
]

beforeEach(() => {
  jest.spyOn(employeesModule, 'getAll').mockImplementation((req, res) => {
    return res.status(200).json(mockEmployees)
  })
  jest.spyOn(employeesModule, 'create').mockImplementation((req, res) => {
    const newEmployee = {
      id: '5',
      ...req.body,
    }
    mockEmployees.push(newEmployee)
    return res.status(201).json(newEmployee)
  })
  jest.spyOn(employeesModule, 'remove').mockImplementation((req, res) => {
    let message = 'Employee has been dismissed.'
    mockEmployees = mockEmployees.filter((e) => {
      if (e.id === req.params.id && e.status === 'hero') {
        message = 'You must respect heroes!'
        return true
      }
      return e.id !== req.params.id
    })
    return res.json({ message })
  })
})

describe('employees controller', () => {
  it('getAll method should return all employees', () => {
    const req = {} // Provide an empty object for req
    const res = { status: jest.fn().mockReturnThis(), json: jest.fn() }
    employeesModule.getAll(req, res)
    expect(res.status).toHaveBeenCalledWith(200)
    expect(res.json).toHaveBeenCalledWith(mockEmployees)
  })

  it('create method should create a new employee', () => {
    const req = { body: { name: 'New Employee', status: 'working' } }
    const res = { status: jest.fn().mockReturnThis(), json: jest.fn() }
    employeesModule.create(req, res)
    const newEmployee = mockEmployees.find((e) => e.id === '5')
    expect(res.status).toHaveBeenCalledWith(201)
    expect(res.json).toHaveBeenCalledWith(newEmployee)
  })

  it('remove method should remove the employee if not a hero', () => {
    const req = { params: { id: '4' } }
    const res = { json: jest.fn() }
    employeesModule.remove(req, res)
    expect(res.json).toHaveBeenCalledWith({
      message: 'Employee has been dismissed.',
    })
  })

  it('remove method should not remove the employee if a hero', () => {
    const req = { params: { id: '2' } }
    const res = { json: jest.fn() }
    employeesModule.remove(req, res)
    expect(res.json).toHaveBeenCalledWith({
      message: 'You must respect heroes!',
    })
  })
})

As you can see, I've covered every exported controller method with tests. The tests pass, but the issue is that each test verifies its own mock implementation, not the version provided by the controller. To test the controller's version, I need access to the employees collection by spying on or mocking it; then I'd have a chance to check collection changes or calls. Any ideas how to fix the tests without changing the controller?

P.S.

I know how to make it work by exporting the employees collection from employees.js, but you know that's an improper change.

Why is my custom listener called when I don't attach it to the test class, neither via annotation nor via the suite XML file? (TestNG)

I'm wondering why a custom listener of mine is getting called even if I don't attach it to the test class or to the XML file.

I've made a custom logger listener that implements ITestListener and IClassListener.
I attached this listener to one of my test classes; the second test class doesn't have a listener attached to it.
I've put both test classes into one suite XML file.

<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite name="Suite A">
    <test name="FirstSuiteFirstClassTest" >
        <classes>
            <class name="ITClassTestA"/>
        </classes>
    </test>
    <test name="FirstSuiteSecondClassTest" >
        <classes>
            <class name="ITClassTestB"/>
        </classes>
    </test>
</suite>
@Listeners({LoggerListener.class})
public class ITClassTestA extends BaseTest {

    @Test
    public void test1InClassTestA() throws InterruptedException {
        logger.info("Started test 1 in ClassTest A: " + TimeUtils.nowUTC());
        Thread.sleep(TimeUnit.SECONDS.toMillis(4));
        logTheName("SDK");
        logger.info("Ended test 1 in ClassTest A: " + TimeUtils.nowUTC());
    }

    @Test
    public void test2InClassTestA() throws InterruptedException {
        logger.info("Started test 2 in ClassTest A: " + TimeUtils.nowUTC());

        Thread.sleep(TimeUnit.SECONDS.toMillis(4));

        logger.info("Ended test 2 in ClassTest A: " + TimeUtils.nowUTC());
    }
}
public class ITClassTestB extends BaseTest {

    @Test
    public void test1InClassTestB() throws InterruptedException {
        logger.info("Started test 1 in ClassTest B at: " + TimeUtils.nowUTC());
        Thread.sleep(TimeUnit.SECONDS.toMillis(4));
        logTheName("testNG");
        logger.info("Ended test 1 In ClassTest B at: " + TimeUtils.nowUTC());
    }

    @Test
    public void test2InClassTestB() throws InterruptedException {
        logger.info("Started test 2 In ClassTest B at: " + TimeUtils.nowUTC());

        Thread.sleep(TimeUnit.SECONDS.toMillis(4));

        logger.info("Ended test 2 In ClassTest B at: " + TimeUtils.nowUTC());
    }
}
public abstract class BaseTest {
    protected Logger logger;

    @BeforeMethod
    public void createLog() {

    }

    @BeforeClass
    public void beforeAll() {
        if (logger == null) {
            logger = LogManager.getLogger(this.getClass().getSimpleName());
        }
    }

    @AfterClass
    public void afterAll() {
        logger.info("Bye dude");
    }


    protected void logTheName(String name) {
        logger.info("The name is {}", name);
    }
}
public class LoggerListener implements ITestListener, IClassListener {
    private final Map<String, List<TestResultStatus>> testResultsStatusPerClass = new ConcurrentHashMap<>();

    private enum TestResultStatus {
        SUCCESSFUL, FAILED, TIMED_OUT, SKIPPED;
    }

    @Override
    public void onBeforeClass(ITestClass testClass) {
        Class<?> testRealClass = testClass.getRealClass();
        String testClassName = testRealClass.getSimpleName();

        initLogFile(testClassName);
    }

    @Override
    public void onAfterClass(ITestClass testClass) {
        List<TestResultStatus> resultStatuses = testResultsStatusPerClass.get(testClass.getRealClass().getSimpleName());

        Map<TestResultStatus, Long> summary = resultStatuses
                .stream()
                // ...

Here I get an NPE, because `resultStatuses` is null: the map has no entry for that test class name.

I didn't attach the rest of the code of `LoggerListener`.
The main thing I'm trying to understand is how this listener gets triggered for a test class I never attached it to.
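As far as I know, this is documented TestNG behaviour: a listener declared with @Listeners on any class in a suite is applied to the whole suite, not only to the annotated class, which would explain why ITClassTestB triggers it. Independently of that, the NPE can be avoided with a defensive lookup in onAfterClass; a minimal self-contained sketch (statuses reduced to strings here for brevity):

```java
import java.util.*;
import java.util.concurrent.ConcurrentHashMap;

public class Main {
    public static void main(String[] args) {
        Map<String, List<String>> testResultsStatusPerClass = new ConcurrentHashMap<>();
        testResultsStatusPerClass.put("ITClassTestA", Arrays.asList("SUCCESSFUL", "SUCCESSFUL"));

        // getOrDefault avoids the NPE for classes with no recorded results,
        // e.g. when onBeforeClass/onTestSuccess never populated the map
        List<String> statuses =
                testResultsStatusPerClass.getOrDefault("ITClassTestB", Collections.emptyList());
        System.out.println(statuses.size());
    }
}
```

If the listener should genuinely apply only to ITClassTestA, the usual route is to drop @Listeners and register it on a per-`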

LR Analysis crashing 1/5 times - looking for a new solution - suggestions required

We are currently executing a ~500 concurrent-user load with ~1500 transactions using LoadRunner 2023 R1. However, LoadRunner Analysis crashes about one in five times when opening the results. My assumption is that the default settings cannot handle the amount of data generated during the test run. The test duration is 4 hours, and we are not storing the test results in any other database.

We have created a ticket with OpenText and are exploring other options for storing the results. We are looking for your input on the best way to handle this situation and avoid it in the future.

One idea we propose is to store the results during the test run in an external database such as SQL Server. We would greatly appreciate any feedback and suggestions from the LoadRunner community.

Thank you for your time and expertise!

We tried waiting for some time after the test run before opening the LR Analysis engine.

How to mock environment variable that uses equality operator

Given the following code, I am unable to properly patch the os.environ variable "USE_THING". I believe this is because USE_THING is a constant computed before the test runs, so I cannot reassign its value. If I define USE_THING inside the do_thing method, I can patch it with no problem, but the comparison with the equality operator at the top of the file seems to prevent me from doing so.

How should I best define USE_THING once in my codebase, to be used by multiple methods, while still allowing me to patch the value in various unit tests?

# kustomization.yaml
configMapGenerator:
  - name: my-service
    behavior: merge
    literals:
      - USE_THING=True
# thing.py

import os


USE_THING = os.environ.get("USE_THING", "False").upper() == "TRUE"

def do_thing():
    if not USE_THING:
        return
    try:
        doing_the_thing()
    except Exception as e:
        logger.error(e)

def do_another_thing():
    if not USE_THING:
        return
    try:
        doing_the_other_thing()
    except Exception as e:
        logger.error(e)
# test.py
import os
from unittest import TestCase, mock

class TestThing(TestCase):
    @mock.patch.dict(os.environ, {"USE_THING": "True"}, clear=True)
    @mock.patch("thing.do_thing")
    def test_do_thing(self, mock_do_thing):
        ...
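Since USE_THING is computed once at import time, patching os.environ afterwards changes nothing; the fix is to patch the already-computed module attribute instead (in a real test file, `@mock.patch("thing.USE_THING", True)`). A self-contained sketch using a stand-in module object so it runs on its own:

```python
import types
from unittest import mock

# Stand-in for thing.py so the sketch is self-contained
thing = types.ModuleType("thing")
thing.USE_THING = False  # what a missing/"False" env var would have produced

def do_thing():
    # mirrors the question's guard
    if not thing.USE_THING:
        return "skipped"
    return "did the thing"

thing.do_thing = do_thing

# Patch the computed attribute, not os.environ
with mock.patch.object(thing, "USE_THING", True):
    patched_result = thing.do_thing()

unpatched_result = thing.do_thing()
```

The alternative design is to read the env var inside a function (or a small settings accessor) each time it's needed; then `mock.patch.dict(os.environ, ...)` works, at the cost of a lookup per call.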

Mocking the AmazonS3 listObjects function in Scala

I'm trying to mock the listObjects function of the AmazonS3 interface, in order to return specific objects when running tests without calling S3.

The code I want to test is doing this logic:

val objects = s3Client.listObjects(
  new ListObjectsRequest()
    .withBucketName(bucketName)
    .withMaxKeys(2000)
)

val latestSchemaKey = objects.getObjectSummaries.toList
  .map(s => s.getKey)
  .sorted(Ordering[String].reverse)
  .head

I wanted to create a new ObjectListing and add object summaries to it, but it seems there are no setters for objectSummaries.

Is there something I'm missing? Thanks for helping!

I looked into the ObjectListing class's functions and noticed it's not possible to set the objects :)
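Two hedged options. First, if memory serves, ObjectListing.getObjectSummaries() returns a mutable list in the v1 SDK, so `listing.getObjectSummaries().add(summary)` may populate it even without a setter; worth verifying against your SDK version. Second, and more robust, wrap the S3 call behind a small interface so tests stub the keys directly and never construct AWS types at all. Sketched in Java with illustrative names (the same shape works as a Scala trait):

```java
import java.util.*;

interface ObjectLister {
    List<String> listKeys(String bucket);
}

class LatestSchemaFinder {
    private final ObjectLister lister;

    LatestSchemaFinder(ObjectLister lister) { this.lister = lister; }

    String latestSchemaKey(String bucket) {
        // same logic as the code under test: reverse-sort keys, take the head
        List<String> keys = new ArrayList<>(lister.listKeys(bucket));
        keys.sort(Comparator.reverseOrder());
        return keys.get(0);
    }
}

public class Main {
    public static void main(String[] args) {
        // Test double returning canned keys; no S3 client involved
        ObjectLister fake = bucket -> Arrays.asList("schema-v1", "schema-v3", "schema-v2");
        System.out.println(new LatestSchemaFinder(fake).latestSchemaKey("my-bucket"));
    }
}
```

The production implementation of the interface is the only place that touches s3Client, so the sorting logic stays testable without mocking AWS classes.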

test result: Error. Program `sh' timed out (timeout set to 480000ms, elapsed time including timeout handling was 480002ms)

When performing regression testing on OpenJDK through JTreg, a shell-script test that exercises JDI-based debugging failed with the error "--Fail: waitForJdbMsg timed out after 40 seconds, looking for /]/, in 1 lines; exitting". A timeout occurs, ultimately causing the test case to fail.

DeoptimizeWalk.sh (shell test script):

#!/bin/sh

#
# Copyright (c) 2002, Oracle and/or its affiliates. All rights reserved.
# DO NOT ALTER OR REMOVE COPYRIGHT NOTICES OR THIS FILE HEADER.
#
# This code is free software; you can redistribute it and/or modify it
# under the terms of the GNU General Public License version 2 only, as
# published by the Free Software Foundation.
#
# This code is distributed in the hope that it will be useful, but WITHOUT
# ANY WARRANTY; without even the implied warranty of MERCHANTABILITY or
# FITNESS FOR A PARTICULAR PURPOSE.  See the GNU General Public License
# version 2 for more details (a copy is included in the LICENSE file that
# accompanied this code).
#
# You should have received a copy of the GNU General Public License version
# 2 along with this work; if not, write to the Free Software Foundation,
# Inc., 51 Franklin St, Fifth Floor, Boston, MA 02110-1301 USA.
#
# Please contact Oracle, 500 Oracle Parkway, Redwood Shores, CA 94065 USA
# or visit www.oracle.com if you need additional information or have any
# questions.
#

#  @test
#  @bug 4525714
#  @summary jtreg test PopAsynchronousTest fails in build 85 with -Xcomp
#  @author Jim Holmlund/Swamy Venkataramanappa
#  @run shell DeoptimizeWalk.sh

#  This is another test of the same bug.  The bug occurs when trying
#  to walk the stack of a deoptimized thread.  We can do this
#  by running in -Xcomp mode and by doing a step which causes deopt,
#  and then a 'where'.  This will cause not all the frames to be shown.

compileOptions=-g

echo "*********** This test only fails with -Xcomp ***************"
createJavaFile()
{
    cat <<EOF > $1.java.1

public class $1 {
    static public void main(String[] args) {
       $1 mine = new $1();
       mine.a1(89);
    }

    public void a1(int p1) {
      int v1 = 89;
      System.out.println("a1" + v1);
      a2(89);
    }


    public void a2(int pp) {
      int v2 = 89;
      System.out.println("a2" + v2);
      a3(89);
    }

    public void a3(int pp) {
      int v3 = 89;
      System.out.println("a3");  //@ 1 breakpoint
      a4(22);                  // it passes if this line is commented out
      System.out.println("jj");
    }
    
    public void a4(int pp) {
      int v4 = 90;
      System.out.println("a4: @1 breakpoint here");
    }
}
EOF
}

# This is called to feed cmds to jdb.
dojdbCmds()
{
    setBkpts @1
    runToBkpt @1
    cmd where
    #cmd next
    cmd step
    cmd where
    cmd quit
    jdbFailIfNotPresent "shtest\.main" 3
}


mysetup()
{
    if [ -z "$TESTSRC" ] ; then
        TESTSRC=.
    fi

    for ii in . $TESTSRC $TESTSRC/.. ; do
        if [ -r "$ii/ShellScaffold.sh" ] ; then
            . $ii/ShellScaffold.sh 
            break
        fi
    done
}

# You could replace this next line with the contents
# of ShellScaffold.sh and this script will run just the same.
mysetup

runit
debuggeeFailIfPresent "Internal exception:"
pass


The .jtr file from jtreg:

#Test Results (version 2)
#Mon Mar 25 18:22:47 CST 2024
#-----testdescription-----
$file=/path/to/kunzejdk/PrimetonKunZe-8/jdk/test/com/sun/jdi/DeoptimizeWalk.sh
$root=/path/to/kunzejdk/PrimetonKunZe-8/jdk/test
author=Jim Holmlund/Swamy Venkataramanappa
keywords=bug4525714 shell
run=USER_SPECIFIED shell DeoptimizeWalk.sh\n
source=DeoptimizeWalk.sh
title=jtreg test PopAsynchronousTest fails in build 85 with -Xcomp

#-----environment-----

#-----testresult-----
description=file\:/path/to/kunzejdk/PrimetonKunZe-8/jdk/test/com/sun/jdi/DeoptimizeWalk.sh
elapsed=480003 0\:08\:00.003
end=Mon Mar 25 18\:22\:47 CST 2024
environment=regtest
execStatus=Error. Program `sh' timed out (timeout set to 480000ms, elapsed time including timeout handling was 480002ms).
harnessLoaderMode=Classpath Loader
harnessVariety=Full Bundle
hostname=kylin-124
javatestOS=Linux 4.19.90-52.22.v2207.ky10.x86_64 (amd64)
javatestVersion=6.0-ea+b04-2024-03-25
jtregVersion=jtreg jtreg6 dev jtreg6
script=com.sun.javatest.regtest.exec.RegressionScript
sections=script_messages shell
start=Mon Mar 25 18\:14\:47 CST 2024
test=com/sun/jdi/DeoptimizeWalk.sh
testJDK=/path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/images/j2sdk-image
totalTime=480004
user.name=root
work=/path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/testoutput/jdk_tier1/JTwork/com/sun/jdi

#section:script_messages
----------messages:(4/255)----------
JDK under test: /path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/images/j2sdk-image
openjdk version "1.8.0_402"
OpenJDK Runtime Environment Kunze (build 1.8.0_402-12)
OpenJDK 64-Bit Server VM Kunze (build 25.402-b12, mixed mode)

#section:shell
----------messages:(3/125)----------
command: shell DeoptimizeWalk.sh
reason: User specified action: run shell DeoptimizeWalk.sh 
elapsed time (seconds): 480.003
----------System.out:(7/370)*----------
*********** This test only fails with -Xcomp ***************
--Compiling first version of /path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/testoutput/jdk_tier1/JTwork/classes/0/com/sun/jdi/aa850509/shtest.java with options: -g
compiling  shtest.java

--Starting jdb, address=
\u76d1\u542c\u5730\u5740: kylin-124:45527
Timeout refired 480 times
----------System.err:(87/5033)*----------
ShellScaffold.sh: Version
vv jdbOutFile  vvvvvvvvvvvvvvvvvvvvvvvvvvvv
\u76d1\u542c\u5730\u5740: kylin-124:45527
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-- jdb process info ----------------------
      850604 TTY -Dapplication.home=/path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/images/j2sdk-image -Xms8m -DHANGINGJAVA-850509_JDB
-- jdb threads: jstack 850604
2024-03-25 18:15:31
Full thread dump OpenJDK 64-Bit Server VM (25.402-b12 mixed mode):

"Attach Listener" #10 daemon prio=9 os_prio=0 cpu=0.44ms elapsed=0.20s tid=0x00007f5860001000 nid=0xd086a waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Service Thread" #9 daemon prio=9 os_prio=0 cpu=0.04ms elapsed=43.15s tid=0x00007f58e40e1800 nid=0xcfad0 runnable [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C1 CompilerThread3" #8 daemon prio=9 os_prio=0 cpu=35.19ms elapsed=43.15s tid=0x00007f58e40de800 nid=0xcfacf waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread2" #7 daemon prio=9 os_prio=0 cpu=3.45ms elapsed=43.16s tid=0x00007f58e40dc800 nid=0xcface waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread1" #6 daemon prio=9 os_prio=0 cpu=3.85ms elapsed=43.16s tid=0x00007f58e40db000 nid=0xcfacd waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"C2 CompilerThread0" #5 daemon prio=9 os_prio=0 cpu=4.98ms elapsed=43.16s tid=0x00007f58e40d8000 nid=0xcfacc waiting on condition [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Signal Dispatcher" #4 daemon prio=9 os_prio=0 cpu=0.35ms elapsed=43.16s tid=0x00007f58e40d5000 nid=0xcfacb runnable [0x0000000000000000]
   java.lang.Thread.State: RUNNABLE

"Finalizer" #3 daemon prio=8 os_prio=0 cpu=0.36ms elapsed=43.18s tid=0x00007f58e40a2000 nid=0xcfaca in Object.wait() [0x00007f5889b69000]
   java.lang.Thread.State: WAITING (on object monitor)
    at java.lang.Object.wait(Native Method)
    - waiting on <0x0000000749481280> (a java.lang.ref.ReferenceQueue$Lock)
    at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:144)
    - locked <0x0000000749481280> (a java.lang.ref.ReferenceQueue$Lock)
    at java.lang.ref.ReferenceQueue.remove(ReferenceQueue.java:165)
    at java.lang.ref.Finalizer$FinalizerThread.run(Finalizer.java:188)

"Reference Handler" #2 daemon prio=10 os_prio=0 cpu=0.57ms elapsed=43.18s tid=0x00007f58e409d800 nid=0xcfac9 in Object.wait() [0x00007f5889c6a000]
   java.lang.Thread.State: WAITING (on object monitor)
    at java.lang.Object.wait(Native Method)
    - waiting on <0x0000000749481448> (a java.lang.ref.Reference$Lock)
    at java.lang.Object.wait(Object.java:502)
    at java.lang.ref.Reference.tryHandlePending(Reference.java:191)
    - locked <0x0000000749481448> (a java.lang.ref.Reference$Lock)
    at java.lang.ref.Reference$ReferenceHandler.run(Reference.java:153)

"main" #1 prio=5 os_prio=0 cpu=120.81ms elapsed=43.20s tid=0x00007f58e400b800 nid=0xcfab2 runnable [0x00007f58ea2eb000]
   java.lang.Thread.State: RUNNABLE
    at java.net.PlainSocketImpl.socketAccept(Native Method)
    at java.net.AbstractPlainSocketImpl.accept(AbstractPlainSocketImpl.java:409)
    at java.net.ServerSocket.implAccept(ServerSocket.java:560)
    at java.net.ServerSocket.accept(ServerSocket.java:528)
    at com.sun.tools.jdi.SocketTransportService.accept(SocketTransportService.java:348)
    at com.sun.tools.jdi.GenericListeningConnector.accept(GenericListeningConnector.java:151)
    at com.sun.tools.example.debug.tty.VMConnection.listenTarget(VMConnection.java:536)
    at com.sun.tools.example.debug.tty.VMConnection.open(VMConnection.java:330)
    - locked <0x00000007493a60d8> (a com.sun.tools.example.debug.tty.VMConnection)
    at com.sun.tools.example.debug.tty.Env.init(Env.java:63)
    at com.sun.tools.example.debug.tty.TTY.main(TTY.java:1083)

"VM Thread" os_prio=0 cpu=2.52ms elapsed=43.18s tid=0x00007f58e4095800 nid=0xcfac7 runnable 

"GC task thread#0 (ParallelGC)" os_prio=0 cpu=1.57ms elapsed=43.20s tid=0x00007f58e4021000 nid=0xcfab6 runnable 

"GC task thread#1 (ParallelGC)" os_prio=0 cpu=0.99ms elapsed=43.20s tid=0x00007f58e4023000 nid=0xcfab7 runnable 

"GC task thread#2 (ParallelGC)" os_prio=0 cpu=0.88ms elapsed=43.20s tid=0x00007f58e4025000 nid=0xcfab8 runnable 

"GC task thread#3 (ParallelGC)" os_prio=0 cpu=0.08ms elapsed=43.20s tid=0x00007f58e4026800 nid=0xcfab9 runnable 

"GC task thread#4 (ParallelGC)" os_prio=0 cpu=0.05ms elapsed=43.20s tid=0x00007f58e4028800 nid=0xcfaba runnable 

"GC task thread#5 (ParallelGC)" os_prio=0 cpu=0.06ms elapsed=43.20s tid=0x00007f58e402a000 nid=0xcfabb runnable 

"GC task thread#6 (ParallelGC)" os_prio=0 cpu=0.05ms elapsed=43.20s tid=0x00007f58e402c000 nid=0xcfabc runnable 

"GC task thread#7 (ParallelGC)" os_prio=0 cpu=0.08ms elapsed=43.20s tid=0x00007f58e402d800 nid=0xcfabd runnable 

"VM Periodic Task Thread" os_prio=0 cpu=17.44ms elapsed=43.15s tid=0x00007f58e40e4000 nid=0xcfad1 waiting on condition 

JNI global references: 12

------------------------------------------

----------rerun:(21/1139)*----------
cd /path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/testoutput/jdk_tier1/JTwork/scratch/2 && \\
HOME=/root \\
LANG=zh_CN.UTF-8 \\
PATH=/bin:/usr/bin:/usr/sbin \\
TESTSRC=/path/to/kunzejdk/PrimetonKunZe-8/jdk/test/com/sun/jdi \\
TESTSRCPATH=/path/to/kunzejdk/PrimetonKunZe-8/jdk/test/com/sun/jdi \\
TESTCLASSES=/path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/testoutput/jdk_tier1/JTwork/classes/0/com/sun/jdi \\
TESTCLASSPATH=/path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/testoutput/jdk_tier1/JTwork/classes/0/com/sun/jdi \\
COMPILEJAVA=/path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/images/j2sdk-image \\
TESTJAVA=/path/to/kunzejdk/PrimetonKunZe-8/build/linux-x86_64-normal-server-release/images/j2sdk-image \\
TESTVMOPTS='-ea -esa -Xmx512m' \\
TESTTOOLVMOPTS='-J-ea -J-esa -J-Xmx512m' \\
TESTJAVACOPTS= \\
TESTJAVAOPTS= \\
TESTTIMEOUTFACTOR=4.0 \\
TESTROOT=/path/to/kunzejdk/PrimetonKunZe-8/jdk/test \\
FS=/ \\
PS=: \\
NULL=/dev/null \\
    sh \\
        /path/to/kunzejdk/PrimetonKunZe-8/jdk/test/com/sun/jdi/DeoptimizeWalk.sh
result: Error. Program `sh' timed out (timeout set to 480000ms, elapsed time including timeout handling was 480002ms).



The problem has been solved. It was caused by the operating system's locale. Set the locale to en_US.UTF-8 and it works fine.

  • Check the locale: echo $LANG
  • Set the locale: export LANG=en_US.UTF-8
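In other words, ShellScaffold.sh scrapes jdb's stdout for an English prompt, but under the zh_CN locale jdb printed "监听地址" (the \u76d1\u542c\u5730\u5740 escapes in the log above) instead of "Listening at address", so waitForJdbMsg never matched and the harness sat until the 480-second timeout. Exporting an English UTF-8 locale in the shell before invoking jtreg avoids this:

```shell
# Run the jtreg suite under an English UTF-8 locale so jdb's output
# matches the patterns ShellScaffold.sh waits for
export LANG=en_US.UTF-8
export LC_ALL=en_US.UTF-8
echo "$LANG"
```

Note the rerun block in the .jtr file shows jtreg propagating LANG=zh_CN.UTF-8 into the test environment, which is why the failure followed the machine's locale rather than the JDK build.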

For a unit test, should I mock this function? (Golang)

type CustomError struct {
    Code    int
    Message string
}

func (e CustomError) Error() string {
    return fmt.Sprintf("%d: %s", e.Code, e.Message)
}

func wrapError(err error) error {
    if errors.As(err, &CustomError{}) {
        return errors.New("unexpected error")
    }
    return err
}

func myUnitTestFunction() error {
    err := mockDoSomethingReturnError()
    return wrapError(err)
}

For a unit test, should I mock wrapError's return value? wrapError has its own unit tests.

Ensuring Smooth Integration

When developing a website with integrated e-commerce functionality, how would you ensure seamless integration with third-party services such as an Amazon account management agency? Specifically, discuss the steps involved in securely connecting the website's backend with the agency's API to facilitate tasks like inventory management, order processing, and customer data synchronization. Additionally, address potential challenges related to data consistency, authentication protocols, and error handling in such an integration scenario.

Describe the steps you attempted to ensure seamless integration of third-party services like an Amazon account management agency with your website's e-commerce functionality. Explain your expectations regarding the process of securely connecting the backend with the agency's API for tasks such as inventory management, order processing, and customer data synchronization. Finally, detail any challenges you encountered related to data consistency, authentication protocols, and error handling during the integration process.

โŒ
โŒ