
How to conduct load testing to determine the maximum concurrent user capacity of a Node.js application on a single EC2 t4g.large instance

I have a Node.js application built with NestJS, hosted on an Amazon EC2 instance of type t4g.large (2 vCPUs and 8 GB of RAM). My application uses Socket.IO with the Redis adapter for real-time communication.

I want to determine the maximum number of concurrent users my application can handle on a single EC2 instance before experiencing performance degradation. Ideally, I'd like to conduct load testing to simulate heavy traffic conditions and measure the application's response.

What are some recommended load testing tools and strategies to achieve this goal? Specifically, I'm looking for approaches that allow me to simulate a high volume of concurrent connections and measure the application's performance metrics under load.

Additionally, are there any specific considerations or optimizations I should make within my Node.js application or NestJS framework to maximize its capacity on a single EC2 instance?

Thank you for any advice or recommendations you can provide!
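For this kind of test, commonly used tools include Artillery (which ships a Socket.IO engine), k6, and Gatling; Artillery is probably the most direct fit for Socket.IO. As a rough sketch only, an Artillery scenario might look like the following — the target URL, event name, and phase numbers are placeholders, and the YAML shape should be checked against the Artillery documentation for your version:

```yaml
config:
  target: "http://your-ec2-host:3000"   # placeholder URL
  phases:
    - duration: 300      # ramp traffic up over five minutes
      arrivalRate: 5     # start at 5 new virtual users per second
      rampTo: 50         # ...climbing to 50 new users per second
scenarios:
  - engine: socketio
    flow:
      - emit:
          channel: "chat message"       # placeholder event name
          data: "hello from load test"
      - think: 10                       # simulated idle time, in seconds
```

While the test runs, watch CPU, memory, event-loop lag, error rate, and p95/p99 latency (CloudWatch plus application-side metrics such as `perf_hooks.monitorEventLoopDelay`), and raise `rampTo` across runs until one of those metrics degrades; that inflection point is your practical capacity. On a 2-vCPU instance, running the app in cluster mode (two Node.js processes) is the usual first optimization, and the Socket.IO Redis adapter you already use is what makes multi-process fan-out work.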

Links: Anne Hathaway, Rat Music, & More

Hey ho! Let’s go!

Though I scheduled this post a couple weeks ago, I’m officially in South Korea by the time you’re reading this. I’m sure I’m having a great time.

We plan to do an overnight stay at a Buddhist temple and I’m most definitely going to load up on Korean beauty and skincare products. There’s also a chance some areas will still have cherry blossoms. Karaoke and ungodly amounts of Korean BBQ are on the agenda!

To all my fellow board gamers! There’s a game hitting Kickstarter in May that might be of interest to you: ApotheBakery. It looks very cute and I signed up to be notified when the campaign goes live. When it comes to board games, I definitely make purchase decisions based on aesthetics.

I went down an Anne Hathaway rabbit hole! Here is a segment of her watching some of her previous roles and I loved how seeing The Princess Diaries again makes her so emotional:

And of course, I was recommended even more Anne Hathaway content and proceeded to watch this conversation between Anne and Emily Blunt about their careers and friendship.

Malaraa in the SBTB Patreon Discord dropped a link to more Murderbot casting news! I know it’s hard to come back from the Alexander Skarsgard casting choice.

Enjoy this kitty being excited by the “rat music” of the Baldur’s Gate menu music. Do your pets react to any particular TV sounds?

Don’t forget to share what cool or interesting things you’ve seen, read, or listened to this week! And if you have anything you think we’d like to post on a future Wednesday Links, send it my way!

 

How to create a multi-user chatbot with langchain

Hope you are doing well. I’ve built a chatbot based on the LangChain documentation below:

Langchain chatbot documentation

In the above LangChain documentation, the prompt template has two input variables: history and human_input.

I have variables for UserID and SessionID, and I store UserID, SessionID, UserMessage, and LLM-Response in a CSV file. I use the Python pandas module to read the CSV, filter the data frame for a given UserID and SessionID, and prepare the chat history for that specific user session. I pass this chat history as the ‘history’ input to the LangChain prompt template (the one discussed in the link above).

Since I set verbose=True, LangChain prints the prompt template to the console on every API call. I started a conversation for the first user and first session and sent 3 human_inputs one by one. Later I started a second user session (so the session ID and user ID changed). Observing the prompt template on the console, I noticed that LangChain is taking not only the chat history of the second user session but also some of the chat history from the previous user session, even though I have written the correct code to prepare the chat history for the given user session. The code to get the chat history is below:

# get chat_history for a given user session from the CSV store
import pandas as pd

def get_chat_history(user_id, session_id, user_query):
    chat_history = "You're a chatbot based on a large language model trained by OpenAI. The text followed by Human: will be user input and your response should be followed by AI: as shown below.\n"
    chat_data = pd.read_csv("DB.csv")
    # keep only the rows belonging to this user and session
    for index in chat_data.index:
        if (chat_data['user_id'][index] == user_id) and (chat_data['session_id'][index] == session_id):
            chat_history += "Human: " + chat_data['user_query'][index] + "\n" + "AI: " + chat_data['gpt_response'][index] + "\n"
    chat_history += "Human: " + user_query + "\n" + "AI: "
    return chat_history

How can I make LangChain consider only the given user session’s chat history in its prompt? Please help.
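One thing worth checking: if the chain itself has a memory object attached, LangChain appends turns to that memory across calls regardless of the filtered history you pass in, which would produce exactly the leakage described. A minimal sketch (names illustrative, not LangChain API) of keeping histories isolated per (user_id, session_id) outside the chain, so the chain can be built with no memory of its own:

```python
from collections import defaultdict

class SessionHistoryStore:
    """In-memory store keyed by (user_id, session_id); mirrors the CSV columns."""

    def __init__(self, preamble):
        self.preamble = preamble
        self._turns = defaultdict(list)

    def append(self, user_id, session_id, user_query, gpt_response):
        self._turns[(user_id, session_id)].append((user_query, gpt_response))

    def render(self, user_id, session_id, user_query):
        # Only this session's turns ever reach the prompt.
        lines = [self.preamble]
        for q, a in self._turns[(user_id, session_id)]:
            lines.append(f"Human: {q}\nAI: {a}")
        lines.append(f"Human: {user_query}\nAI: ")
        return "\n".join(lines)
```

The rendered string would then be passed as the prompt's `history` input while the chain is constructed without any memory attached, so LangChain has no second source of turns to mix in.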

Running Telegram bot made on python as background service on Ubuntu

I couldn't run the Telegram bot as a background service; when run directly from the terminal, it works fine.

I've made a simple Telegram bot using the telebot library:

import telebot

TOKEN = 'YOUR_BOT_TOKEN'  # keep real tokens out of shared code
bot = telebot.TeleBot(TOKEN)

@bot.message_handler(content_types=['text'])
def proc_msg(message):
    chat_id = message.chat.id
    bot.send_message(chat_id,text="I'm test bot. Sorry, can't answer anything else. Made for dummy tests.")

bot.polling(none_stop=True, interval=0)

It works when I just run the script via terminal:

python test_bot.py

I need to run the bot as background service with automatic restart after reboots, so I made test_bot.service file:

[Unit]
Description=Test Bot
After=multi-user.target

[Service]
Type=simple
Restart=always
ExecStart=/usr/bin/python /etc/tg_bots/test4477qwe_bot/test_bot.py

[Install]
WantedBy=multi-user.target

Then I put the file in /etc/systemd/system and ran systemctl daemon-reload. At this stage everything is fine; my service appears in the list given by systemctl.

But when I try to start the service with systemctl start test_bot, nothing happens. It actually looks like systemctl start test_bot is hanging.

What could be wrong? (The system is Ubuntu 22.04.3 LTS. Maybe it is not the best choice for running Python Telegram bots?)
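When a unit misbehaves like this, `journalctl -u test_bot -e` and `systemctl status test_bot` usually show why. Two common culprits with a unit like the one above: `/usr/bin/python` often does not exist on Ubuntu 22.04 (the binary is `python3`), and ordering after `multi-user.target` is unusual for a network-dependent service. A sketch of a revised unit file under those assumptions (paths unchanged from the question):

```ini
[Unit]
Description=Test Bot
After=network-online.target
Wants=network-online.target

[Service]
Type=simple
Restart=always
RestartSec=5
# Ubuntu 22.04 ships python3; point this at your venv's python if you use one.
ExecStart=/usr/bin/python3 /etc/tg_bots/test4477qwe_bot/test_bot.py
Environment=PYTHONUNBUFFERED=1

[Install]
WantedBy=multi-user.target
```

After editing, run `systemctl daemon-reload` again before `systemctl start test_bot`.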

Problem with SSL in Chrome (Hostinger, Let's Encrypt)

I have this problem:

Failed to load resource: net::ERR_SSL_PROTOCOL_ERROR
There was an error! TypeError: Failed to fetch

I'm trying to make an HTTPS POST or GET request to the server.

This problem happens only in the Chrome browser (Firefox and Postman work fine).

I made a simple server:

const express = require('express');
const cors = require('cors');
const fs = require('fs');
const https = require('https');
const bodyParser = require('body-parser');

const app = express();

// CORS middleware to allow cross-origin requests
app.use(cors());

// Parse JSON request body
app.use(bodyParser.json());

app.post('/hello_world', (req, res) => {
    const { data } = req.body;
    // Your logic...
    res.status(200).send("Your response");
});

// SSL certificate
const options = {
    key: fs.readFileSync('/etc/letsencrypt/live/myhostname.com/privkey.pem', 'utf8'),
    cert: fs.readFileSync('/etc/letsencrypt/live/myhostname.com/cert.pem', 'utf8'),
    ca: fs.readFileSync('/etc/letsencrypt/live/myhostname.com/chain.pem', 'utf8')
};

const httpsServer = https.createServer(options, app);

// Listen on either port 443 for https production server
httpsServer.listen(443, () => {
    console.log('HTTPS Server running on port 443');
});

Request (frontend):

        const response = await fetch("https://myhostname.com/hello_world", {
            method: 'POST',
            headers: {
                'Content-Type': 'application/json',
            },
            body: JSON.stringify({data: "123"}),
        });
        const rebus = await response.json();

Certificates created with:

certbot certonly -d myhostname.com

What could be the problem, and how can I find out the reason?

I tried other browsers (Firefox works well, and so does Postman).
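One frequent cause of ERR_SSL_PROTOCOL_ERROR in Chrome specifically is an incomplete certificate chain: `cert.pem` contains only the leaf certificate, and Chrome is stricter than Firefox about missing intermediates. A hedged tweak to the certificate options, using the `fullchain.pem` that certbot also writes alongside the other files:

```javascript
// Sketch: fullchain.pem bundles the leaf certificate plus the Let's Encrypt
// intermediates, so a separate `ca` entry is not needed.
const options = {
    key: fs.readFileSync('/etc/letsencrypt/live/myhostname.com/privkey.pem', 'utf8'),
    cert: fs.readFileSync('/etc/letsencrypt/live/myhostname.com/fullchain.pem', 'utf8'),
};
```

To check what the server actually presents, `openssl s_client -connect myhostname.com:443 -servername myhostname.com` lists the certificate chain sent during the handshake.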

Error when running script for Telegram bot

Traceback (most recent call last):
  File "C:\Users\USERPC\Desktop\Karasbomb\Vizione\main.py", line 1162, in <module>
    @dp.throttled(anti_flood,rate=3)
     ^^^^^^^^^^^^
AttributeError: 'Dispatcher' object has no attribute 'throttled'. Did you mean: 'throttle'?
Unclosed client session
client_session: <aiohttp.client.ClientSession object at 0x0000020D4CA6B530>

grammY framework Telegram bot not sending message

I'm trying to create a bot that can search for something with an API. When the user clicks the bot's inline buttons, the bot sends the result from the API as text. Here is my code:

const { Bot, InlineKeyboard, Api } = require('grammy');
const axios = require('axios');

const bot = new Bot("YOUR_BOT_TOKEN"); // keep real tokens out of shared code

//const quranApiUrl = axios.get('https://quran-api-id.vercel.app/surahs') //{
    const quranApiUrl = ('https://quran-api-id.vercel.app/surahs')
  //headers: //{
    //'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36'
  //}
//})
//.then(response => {
  //quranApiUrl = response;
//})
//.catch(error => {
  //console.error(error);
//});



bot.command("start", (ctx) => {
    const inlineKeyboard = new InlineKeyboard();
    const infoButton = { text: 'Info', url: 'https://t.me/Ayakaninfo'};
    //const infobutton2 = { text: 'Search Surah', callback_data: 'info' };
    inlineKeyboard.row(infoButton);
    ctx.reply('Assalamualaikum!\n Hi nice to meet you! I am Ayakan. I can help you search for Suran. Select the button below to get started.', { reply_markup: inlineKeyboard });
    // ... rest of your code ...
})

bot.command('surah', async (ctx) => {
    try {
        const response = await axios.get(quranApiUrl);
        const surahList = response.data;
    
        // Pesan pertama dengan 64 tombol
        let inlineKeyboard = new InlineKeyboard();
        for (let i = 0; i < 64; i += 8) {
          const row = surahList.slice(i, i + 8).map(surah => {
            return { text: surah.number.toString(), callback_data: `surah_${surah.number}` };
          });
          inlineKeyboard.row(...row);
        }
        const infoButton = { text: 'Info', url: 'https://t.me/Ayakaninfo' };
        inlineKeyboard.row(infoButton);
        await ctx.reply(`Pilih surah:`, { reply_markup: inlineKeyboard });
    
        // Pesan kedua dengan 50 tombol
        inlineKeyboard = new InlineKeyboard();
        for (let i = 64; i < surahList.length; i += 8) {
          const row = surahList.slice(i, i + 8).map(surah => {
            return { text: surah.number.toString(), callback_data: `surah_${surah.number}` };
          });
          inlineKeyboard.row(...row);
        }
        inlineKeyboard.row(infoButton);
        await ctx.reply(`Pilih surah:`, { reply_markup: inlineKeyboard });
    
      } catch (error) {
        console.log(error)
        ctx.reply('Error retrieving surah list. Please try again later.');
      }
    });

//bot.callbackQuery(/surah_(\d+)/, async (ctx) => {
 // const surahNumber = ctx.match[1];

  //try {
    //const response = await axios.get(`${quranApiUrl}/${surahNumber}`);
    //const surahData = response.data;

    //await sendSurahInfo(ctx, surahData);

  //} catch (error) {
    //console.log(error)
    //ctx.reply('Error retrieving Quran data. Please try again later.');
  //}
//});

//async function sendSurahInfo(ctx, surahData) {
    //const surahTitle = surahData.name;
    //const surahDescription = surahData.description;
    //const surahRevelation = surahData.revelation;
  
    //const sendSurahButton = { text: 'Kirim Surah', callback_data: `send_surah_${surahData.number}` };
    //const infoButton = { text: 'Info', url: 'https://t.me/Ayakaninfo' };
    //const inlineKeyboard = new InlineKeyboard().add(sendSurahButton, infoButton);
  
    //await ctx.reply(`Surah ${surahTitle}\n\n${surahDescription}\n\nDiturunkan di: ${surahRevelation}\n\nSurah ke ${surahData.number}\n\nJumlah ayat: ${surahData.numberOfAyahs}`, {
      //reply_markup: inlineKeyboard,
    //});
  //}
  
  bot.callbackQuery(/surah_(\d+)/, async (ctx) => {
    console.log('CallbackQuery handler is triggered.');
    const surahNumber = ctx.match[1];
    console.log('Surah Number:', surahNumber);
  
    try {
      const response = await axios.get(`${quranApiUrl}/${surahNumber}`);
      const surahData = response.data;
      const surahTitle = surahData.name;
      const bismillah = surahData.bismillah.arab;
      const surahAyahs = surahData.ayahs.arab;
      const surahTranslation = surahData.ayahs.translation;
  
      const maxCharacters = 4096;
      const maxMessages = 1000; // Ubah sesuai kebutuhan, ini adalah jumlah maksimum pesan yang akan dikirimkan
  
      // Membagi ayat menjadi potongan-potongan
      const ayahChunks = chunkArray(surahAyahs, 5);
  
      // Menghitung jumlah pesan yang akan dikirim
      const totalMessages = Math.ceil(ayahChunks.length / maxCharacters);
  
      // Membuat teks pesan pertama
      const ayahTextFirstMessage = `Nama Surah: ${surahTitle}\n${bismillah}\n\n`;
  
      // Mengirim pesan pertama
      await ctx.api.sendMessage(ctx.chat.id, ayahTextFirstMessage, { parse_mode: 'HTML' });

      for (let i = 0; i < totalMessages; i++) {
        const startIdx = i * maxMessages;
        const endIdx = startIdx + maxMessages;
  
        // Mengambil potongan ayat untuk pesan saat ini
        const currentChunk = ayahChunks.slice(startIdx, endIdx);
  
        // Membuat teks pesan
        const ayahText = `${currentChunk.join('\n')}\n\n${surahTranslation}\n\n`;
  
        // Mengirim pesan
        await ctx.api.sendMessage(ctx.chat.id, ayahText, { parse_mode: 'HTML' });
      }
  
    } catch (error) {
      console.log(error);
      ctx.reply('Error retrieving Quran data. Please try again later.');
    }
});

  

function chunkArray(array, size) {
  if (!array) {
      return []; // or throw an error, depending on your requirements
  }

  const chunkedArray = [];
  for (let i = 0; i < array.length; i += size) {
      chunkedArray.push(array.slice(i, i + size));
  }
  return chunkedArray;
}

function sliceText(text, maxCharacters) {
    const slicedTexts = [];
    while (text.length > maxCharacters) {
        const sliceIndex = text.lastIndexOf('\n', maxCharacters);
        const slicedText = text.slice(0, sliceIndex);
        slicedTexts.push(slicedText);
        text = text.slice(sliceIndex + 1);
    }
    slicedTexts.push(text);
    return slicedTexts;
}

bot.start();

Then I try clicking the bot's buttons, but the bot only sends the ayahTextFirstMessage.

The ayahText is never sent, and there is no error in the console. Logging the API result gives the same structure as assumed in the code. I don't know where the problem is because the bot doesn't report any error. Can someone explain why and fix the code? Thank you. Sorry for my bad English :)
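One plausible cause, judging from the code alone: if the API returns `ayahs` as an array of per-ayah objects, then `surahData.ayahs.arab` is `undefined`, `chunkArray` returns `[]`, `totalMessages` becomes 0, and the sending loop never runs — which matches only the first message going out. A hedged sketch of building the follow-up messages under Telegram's 4096-character limit, assuming each ayah object carries its own `arab` and `translation` fields (an assumption about the API shape; `console.log(surahData.ayahs)` first to confirm it):

```javascript
// Sketch: assumes each element of `ayahs` has `arab` and `translation`
// properties; adjust the names to the actual API response.
function buildAyahMessages(ayahs, maxCharacters = 4096) {
  const messages = [];
  let current = '';
  for (const ayah of ayahs) {
    const line = `${ayah.arab}\n${ayah.translation}\n\n`;
    // Start a new message before exceeding Telegram's length limit.
    if (current && current.length + line.length > maxCharacters) {
      messages.push(current);
      current = '';
    }
    current += line;
  }
  if (current) messages.push(current);
  return messages;
}
```

Each resulting string can then be sent with `ctx.api.sendMessage` in a simple loop, replacing the `totalMessages` arithmetic entirely.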

Sci-Fi, Contemporary Romance, & More

If you want more ways to find romances on sale, sign up for alerts for free books and sales from Red Feather Romance!

One Dance with a Duke

One Dance with a Duke by Tessa Dare is $1.99! This is the first book in Dare’s Stud Club trilogy and features a horse breeder hero. Readers say this one is slow to start with, but recommend powering through! There is a marriage of convenience aspect that reviewers really enjoyed.

In One Dance with a Duke—the first novel in Tessa Dare’s delightful new trilogy—secrets and scandals tempt the irresistible rogues of the Stud Club to gamble everything for love.

A handsome and reclusive horse breeder, Spencer Dumarque, the fourth Duke of Morland, is a member of the exclusive Stud Club, an organization so select it has only ten members—yet membership is attainable to anyone with luck. And Spencer has plenty of it, along with an obsession with a prize horse, a dark secret, and, now, a reputation as the dashing “Duke of Midnight.” Each evening he selects one lady for a breathtaking midnight waltz. But none of the women catch his interest, and nobody ever bests the duke—until Lady Amelia d’Orsay tries her luck.

In a moment of desperation, the unconventional beauty claims the duke’s dance and unwittingly steals his heart. When Amelia demands that Spencer forgive her scapegrace brother’s debts, she never imagines that her game of wits and words will lead to breathless passion and a steamy proposal. Still, Spencer is a man of mystery, perhaps connected to the shocking murder of the Stud Club’s founder. Will Amelia lose her heart in this reckless wager or win everlasting love?

Add to Goodreads To-Read List →

You can find ordering info for this book here.

No Judgments

No Judgments by Meg Cabot is $1.99! This is a recent release, so snap this one up if you’re curious. We had a great guest review of this one, where Esme recommends this to readers who aren’t so invested in Cabot’s backlist:

I think if you’re not as intimate (or obsessed) with Cabot’s backlist as I am, or you’re a dyed in the wool rom-com consumer, you’ll enjoy this. It’s very much in the vein of Tessa Bailey’s Fix Her Up.

The storm of the century is about to hit Little Bridge Island, Florida—and it’s sending waves crashing through Sabrina “Bree” Beckham’s love life…

When a massive hurricane severs all power and cell service to Little Bridge Island—as well as its connection to the mainland—twenty-five-year-old Bree Beckham isn’t worried . . . at first. She’s already escaped one storm—her emotionally abusive ex—so a hurricane seems like it will be a piece of cake.

But animal-loving Bree does become alarmed when she realizes how many islanders have been cut off from their beloved pets. Now it’s up to her to save as many of Little Bridge’s cats and dogs as she can . . . but to do so, she’s going to need help—help she has no choice but to accept from her boss’s sexy nephew, Drew Hartwell, the Mermaid Café’s most notorious heartbreaker.

But when Bree starts falling for Drew, just as Little Bridge’s power is restored and her penitent ex shows up, she has to ask herself if her island fling was only a result of the stormy weather, or if it could last during clear skies too.

Add to Goodreads To-Read List →

You can find ordering info for this book here.

Velocity Weapon

Velocity Weapon by Megan E. O’Keefe is $2.99! This is book one in a new space opera that focuses on siblings. I’ve heard good things about this one, but anything labeled “space opera” gives me pause since that isn’t my preferred sci-fi subgenre.

Dazzling space battles, intergalactic politics, and rogue AI collide in Velocity Weapon, the first book in this epic space opera by award-winning author Megan O’Keefe.

Sanda and Biran Greeve were siblings destined for greatness. A high-flying sergeant, Sanda has the skills to take down any enemy combatant. Biran is a savvy politician who aims to use his new political position to prevent conflict from escalating to total destruction.

However, on a routine maneuver, Sanda loses consciousness when her gunship is blown out of the sky. Instead of finding herself in friendly hands, she awakens 230 years later on a deserted enemy warship controlled by an AI who calls himself Bero. The war is lost. The star system is dead. Ada Prime and its rival Icarion have wiped each other from the universe.

Now, separated by time and space, Sanda and Biran must fight to put things right.

Add to Goodreads To-Read List →

You can find ordering info for this book here.

Boyfriend

Boyfriend by Sarina Bowen is 99c! This is a new adult hockey romance that I believe was mentioned in a previous Hide Your Wallet. It’s also the first book in the Moo U Hockey Romance series. Have you read this one? Tempted to buy it for myself.

A new hockey player to steal your heart this fall…

The dreamiest player on the Moo U hockey team hangs a flyer on the bulletin board, and I am spellbound:

Rent a boyfriend for the holiday. For $25, I will be your Thanksgiving date. I will talk hockey with your dad. I will bring your mother flowers. I will be polite, and wear a nicely ironed shirt…

Everyone knows it’s a bad idea to introduce your long-time crush to your messed-up family. But I really do need a date for Thanksgiving, even if I’m not willing to say why. So I tear his phone number off of that flyer… and accidentally entangle our star defenseman in a ruse that neither of us can easily unwind.

Who knew that Weston’s family was even nuttier than mine? He needs a date, too, for the most uncomfortable holiday engagement party ever thrown.

There will be hors d’oeuvre. There will be faked PDA. And there will be pro-level awkwardness…

Add to Goodreads To-Read List →

You can find ordering info for this book here.

Real-time chat streaming application using SignalR in C# and Angular

I am developing a chat application using C# and Angular. My C# controller has a custom-made text-generative algorithm which generates a response based on the user input from the frontend. This response is generated in chunks. I want to display the response on the frontend as the chunks are generated (also called chat streaming: streaming allows the model to generate and display text incrementally rather than waiting until the entire response is generated).

I tried using SignalR for real-time communication between client and server to display the chunks as and when they are generated. I created a new hub connection with the URL "/MessageHub" and then started the hub connection. There are no errors building and starting the hub. Then I tried to send the user message from the client side to the server side using the hub connection's invoke method, but at this point I get the error "Connection cannot be established".

Code in MessageHub Class in C#:

        public async Task SendMessage(string message, int requestId)
        {
            var response = _chatbotProvider.GetResponse(message, requestId);

            try
            {
                await Clients.All.SendAsync("ReceiveMessage", response);
            }
            catch (Exception ex)
            {
                Console.WriteLine(ex.Message);
            }
        }

Code in Angular to create and start the Hub Connection.

ngOnInit(): void {
    this._hubConnection = new HubConnectionBuilder().withUrl("/MessageHub").build();

    this._hubConnection.start().then(() => console.log('Connection Started!'))
    .catch(err => console.log('Error while establishing connection'));

  }

  sendMessage(){
    this.requestId = this.route.snapshot.params['requestId'];

    this._hubConnection.invoke('SendMessage', this.userMessage, this.requestId).catch(err => console.error(err));

    this.chatMessages.push({ role: 'user', content: this.userMessage });

    this._hubConnection.on("ReceiveMessage", (botResponse) => {
      this.chatMessages.push({ role: 'assistant', content: botResponse})
    });
    console.log(this.chatMessages)
    this.userMessage = '';
  }

At this point I get a connection error (the error screenshot is not reproduced here).

What I am expecting is the chat streaming functionality in my application (Just as the response is generated in ChatGPT, chunk by chunk).

Or is there any other method with which I can achieve the same?
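On the server side, SignalR's usual answer to this is a streaming hub method that returns `IAsyncEnumerable<string>` (or a `ChannelReader<string>`), which the Angular client consumes with `connection.stream(...)` and a subscriber instead of `invoke`. Independent of that, the frontend needs to append each chunk to the message being built rather than pushing a new message per chunk. A hedged JavaScript sketch of that accumulation logic (the message shape mirrors the `chatMessages` array in the question; the `done` flag is an invention of this sketch):

```javascript
// Append a streamed chunk to the in-progress assistant message,
// creating it on the first chunk. `messages` mirrors this.chatMessages.
function appendChunk(messages, chunk) {
  const last = messages[messages.length - 1];
  if (last && last.role === 'assistant' && !last.done) {
    last.content += chunk;          // grow the in-progress response
  } else {
    messages.push({ role: 'assistant', content: chunk, done: false });
  }
  return messages;
}

// Mark the in-progress response finished once the stream completes.
function finishStream(messages) {
  const last = messages[messages.length - 1];
  if (last && last.role === 'assistant') last.done = true;
  return messages;
}
```

With the `@microsoft/signalr` client this would plug into something like `connection.stream('SendMessage', msg, id).subscribe({ next: c => appendChunk(this.chatMessages, c), complete: () => finishStream(this.chatMessages) })` — a sketch, assuming the hub method is rewritten to stream.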

How to create isolated session for ConversationBufferMemory per user in Langchain?

Problem Statement

I wish to create a FastAPI endpoint with isolated user sessions for my LLM, which uses ConversationBufferMemory. This memory serves as context for the conversation between the AI and the user. Currently, the memory is shared across all users. I wish instead to isolate the memory per user.

I have the base implementation of the Langchain core library below.

Boilerplate Code

from langchain.memory import ConversationBufferMemory
from langchain.prompts import PromptTemplate
from langchain.chat_models import ChatOpenAI
from langchain.chains import ConversationChain

memory = ConversationBufferMemory(memory_key="chat_history", k=12)

async def interview_function(input_text):
    prompt = PromptTemplate(
        input_variables=["chat_history", "input"], template=interview_template)
    chat_model = ChatOpenAI(model_name="gpt-4-1106-preview", temperature = 0, 
                        openai_api_key = OPENAI_API_KEY, max_tokens=1000)
    llm_chain = ConversationChain(
        llm=chat_model,
        prompt=prompt,
        verbose=True,
        memory=memory,
    )
    
    return llm_chain.predict(input=input_text)

I made progress by subclassing the ConversationChain with the intention of passing custom memory keys, which are related to the user's unique id, from a separate data store, like a SQL table, which I use to reference the various users interacting with my LLM.

Subclassing Progress

def create_extended_conversation_chain(keys: List[str]):
    class ExtendedConversationChain(ConversationChain):
        input_key: List[str] = Field(keys)

        @property
        def input_keys(self) -> List[str]:
            """Override the input_keys property to return the new input_key list."""
            return self.input_key

        @root_validator(allow_reuse=True)
        def validate_prompt_input_variables(cls, values: Dict) -> Dict:
            """Validate that prompt input variables are consistent."""
            memory_keys = values["memory"].memory_variables
            input_key = values["input_key"]
            prompt_variables = values["prompt"].input_variables
            expected_keys = memory_keys + input_key
            
            if set(expected_keys) != set(prompt_variables):
                raise ValueError(
                    "Got unexpected prompt input variables. The prompt expects "
                    f"{prompt_variables}, but got {memory_keys} as inputs from "
                    f"memory, and {input_key} as the normal input keys."
                )
            return values
    return ExtendedConversationChain

However, I am stuck on creating this custom memory key. My memory keys do not seem to be accessible after they are defined at instantiation, as in my boilerplate code section.

Is there a LangChain-specific solution, or do I need to create my own cache and have my LLM interact with it?
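One approach that avoids subclassing entirely is to keep one memory object per user rather than the module-level singleton in the boilerplate: a lazily populated cache looked up by the user's id on each request. The sketch below uses a plain factory so it stands alone; in the real app the factory would be something like `lambda: ConversationBufferMemory(memory_key="chat_history", k=12)`, and the `ConversationChain` would be built per request with the user's memory (all names here are illustrative, not LangChain API):

```python
from collections import OrderedDict

class PerUserMemoryCache:
    """Lazily creates one memory object per user_id, with simple LRU eviction."""

    def __init__(self, factory, max_users=1000):
        self._factory = factory          # e.g. a ConversationBufferMemory factory
        self._max = max_users
        self._store = OrderedDict()

    def get(self, user_id):
        if user_id not in self._store:
            if len(self._store) >= self._max:
                self._store.popitem(last=False)   # evict least recently used
            self._store[user_id] = self._factory()
        self._store.move_to_end(user_id)          # mark as recently used
        return self._store[user_id]
```

Inside `interview_function`, `memory = cache.get(user_id)` then passes that memory to the chain. Newer LangChain also offers `RunnableWithMessageHistory`, which takes a `get_session_history` callable keyed by session id and solves the same problem at the framework level — worth checking against your installed version.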

Foreign fishing boat detained in Op Naga Laut

KUALA TERENGGANU: A Vietnamese fishing boat carrying seafood, diesel fuel and fishing equipment worth nearly RM1.5 million was seized in Op Naga Laut the day before yesterday.

According to the Director of the Malaysian Maritime Enforcement Agency for Terengganu, Mohd. Khairulanuar Abd. Majid, four crew members were also detained in the operation, 133 nautical miles from Kuala Terengganu.

He said all of them were detained while engrossed in fishing in Malaysian waters, before they noticed the presence of enforcement personnel.

According to him, before the arrest, a patrol team had detected two boats suspected of belonging to foreign fishermen believed to have intruded into national waters.

"The two suspicious boats were detected during an Op Naga Barat patrol, and initially we suspected they were fishing in Malaysian waters without permission.

"An inspection of one of the boats found four Vietnamese crew members, including the skipper, who were then detained, but the other boat managed to flee towards the waters at the national border.

"The value of the seizure, including the boat, 3.5 tonnes of catch, 2,000 litres of diesel and the fishing equipment, totals nearly RM1.5 million," he said when contacted here today.

He added that the four Vietnamese fishermen are aged between 29 and 61, and the case is being investigated under the Fisheries Act 1985 and the Immigration Act 1959/1963.

Mohd. Khairulanuar stressed that his agency will keep stepping up patrols and surveillance in national waters despite the current uncertain weather and rough seas.

According to him, Terengganu Maritime Malaysia assets are operating as usual to ensure that the sovereignty of Malaysian law is not violated by foreign fishermen.

"This is the 37th seizure this year, and Terengganu Maritime Malaysia will not stop curbing the intrusion of foreign fishermen into national waters," he said. – UTUSAN

The post Bot nelayan asing ditahan dalam Op Naga Laut appeared first on Utusan Malaysia.

Senior citizen still uses fibre boats bought in the 1980s

By NOOR HAYATI MAMAT

HULU TERENGGANU: An elderly man near here still uses two fibre boats he bought back in the 1980s to move his family whenever floods hit Kampung Kepah here.

Ayub Deraman, 87, said the two boats, bought for RM600 and RM880, have been of great service to him and his family, since his house, located only a few metres from the river, is frequently hit by floods.

"I have lived here since 1964 and face floods every monsoon season due to the overflow of Sungai Telemong and Sungai Kepah. My house is the first to be inundated.

"I saved money from farming until I could buy these fibre boats, to take my wife, Zainab Mat Ali, 77, and our children to the temporary relief centre (PPS) when they still lived with us.

"These fibre boats have leaked several times after hitting tree branches, but as long as they can be repaired I will keep using them," he said when met at his home in Kampung Kepah, near here, today.

The grandfather of 14 added that it is his routine to make preparations as early as October for any possibility of floods, including packing up belongings at home and checking the fibre boats.

He also said that although he usually does not go to the PPS and just stays put at home, the fibre boats are still used to take his wife to the PPS.

"My house has been flooded twice this monsoon season, most recently last Friday, and my wife moved to the PPS for only a few hours because the water began to recede soon after.

"Yesterday the water fully receded from the yard, so I took the opportunity to check the boats again and found that one section had sprung a leak," he said. – UTUSAN

The post Warga emas guna bot fiber dibeli tahun 1980-an appeared first on Utusan Malaysia.

Chatterbot not setting the database name

I am trying to set the name for my database in ChatterBot. Here is the code:

from chatterbot import ChatBot
from chatterbot.trainers import ListTrainer
import os

def train_bot():
    chatbot = ChatBot('Bot',
    storage_adapter='chatterbot.storage.SQLStorageAdapter',
    databse="mydb",
    trainer='chatterbot.trainers.ListTrainer')
    for file in os.listdir('G:/Django Chatbot/SRC/chat_bot/data/'):
        convData = open(r'G:/Django Chatbot/SRC/chat_bot/data/' + file, encoding='UTF-8').readlines()
        chatbot.set_trainer(ListTrainer)
        chatbot.train(convData)
    print("Training completed")


train_bot()

I have referred to the documentation here. The sqlite3 database gets created with name 'db.sqlite3'. I want to change this name to be 'mydb.sqlite3' as specified in the code.
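One likely cause worth checking: the keyword argument in the snippet is spelled `databse`, so ChatterBot never sees it and falls back to the default `db.sqlite3`. Note also that the parameter name varies by ChatterBot version — older releases accept `database`, while 1.0+ uses `database_uri`. A sketch of the corrected constructor call (version-dependent, so verify against your installed release):

```python
from chatterbot import ChatBot

# ChatterBot 1.0+: the SQL storage adapter takes a full database URI.
chatbot = ChatBot(
    'Bot',
    storage_adapter='chatterbot.storage.SQLStorageAdapter',
    database_uri='sqlite:///mydb.sqlite3',
)
```

On older versions, `database='mydb.sqlite3'` plays the same role.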

How do I have my Dialogflow ES bot initiate the welcome intent on any user input, not just greetings?

I'm trying to build a close-ended chatbot (for WhatsApp) that provides customers with a menu of options at the start of a conversation. The Default Welcome Intent fires upon receipt of a greeting as defined in its training phrases, and that's fine, but I'd like the bot to reply with the menu as long as these two conditions exist:

  1. It's a "new conversation" and
  2. any input is received from the customer, not just a greeting.

I tried to work with the Default Fallback Intent to handle this and I'm just not having any luck with it.

Can someone please suggest a solution that will allow the bot to respond at the start of conversation to any user input, not necessarily just a greeting as defined in the Default Welcome Intent?

AttributeError with Django

I am trying to build an API that uses a chatbot I built. I have the chatbot and the other classes it needs in a separate file called functions.py. In my views.py I import these classes and get this error:

Traceback (most recent call last):
  File "/home/dude/.local/lib/python3.10/site-packages/django/core/handlers/exception.py", line 55, in inner
    response = get_response(request)
  File "/home/dude/.local/lib/python3.10/site-packages/django/core/handlers/base.py", line 197, in _get_response
    response = wrapped_callback(request, *callback_args, **callback_kwargs)
  File "/home/dude/.local/lib/python3.10/site-packages/django/views/decorators/csrf.py", line 56, in wrapper_view
    return view_func(*args, **kwargs)
  File "/home/dude/.local/lib/python3.10/site-packages/django/views/generic/base.py", line 104, in view
    return self.dispatch(request, *args, **kwargs)
  File "/home/dude/.local/lib/python3.10/site-packages/rest_framework/views.py", line 509, in dispatch
    response = self.handle_exception(exc)
  File "/home/dude/.local/lib/python3.10/site-packages/rest_framework/views.py", line 469, in handle_exception
    self.raise_uncaught_exception(exc)
  File "/home/dude/.local/lib/python3.10/site-packages/rest_framework/views.py", line 480, in raise_uncaught_exception
    raise exc
  File "/home/dude/.local/lib/python3.10/site-packages/rest_framework/views.py", line 506, in dispatch
    response = handler(request, *args, **kwargs)
  File "/home/dude/.local/lib/python3.10/site-packages/rest_framework/decorators.py", line 50, in handler
    return func(*args, **kwargs)
  File "/home/dude/Desktop/Projects/Finance Webapp/Financial_Chatbot_API/chatbot/views.py", line 36, in chatbotRequestHandler
    chatbotObj = ChatBot()
  File "/home/dude/Desktop/Projects/Finance Webapp/Financial_Chatbot_API/chatbot/functions.py", line 104, in __init__
    self.vectorizer = load(file)
AttributeError: Can't get attribute 'custom_preprocessor' on <module '__main__' from '/home/dude/Desktop/Projects/Finance Webapp/Financial_Chatbot_API/manage.py'>

This is how I'm importing these classes:

from .functions import StockGrab, ChatBot, custom_preprocessor

I have tried adding `custom_preprocessor` to the import, but that did not work.

Any other ideas?
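The error is characteristic of pickle's lookup-by-reference: the vectorizer was presumably saved from a script where `custom_preprocessor` was defined at top level, so the pickle records it as `__main__.custom_preprocessor`. Under Django, `__main__` is manage.py, which has no such attribute, and importing the function into views.py doesn't change that. A self-contained sketch (standard library only, with a stand-in preprocessor) that reproduces the error and shows one workaround:

```python
import pickle
import sys

def custom_preprocessor(text):
    # Stand-in for the preprocessor that was pickled with the vectorizer.
    return text.lower()

# Simulate pickling from a standalone script: pickle stores the function
# as a reference '__main__.custom_preprocessor', not as a value.
main_mod = sys.modules["__main__"]
custom_preprocessor.__module__ = "__main__"
custom_preprocessor.__qualname__ = "custom_preprocessor"
main_mod.custom_preprocessor = custom_preprocessor
payload = pickle.dumps(custom_preprocessor)

# Simulate unpickling under Django, where __main__ (manage.py) lacks the
# attribute -> AttributeError, matching the traceback in the question.
del main_mod.custom_preprocessor
reproduced = False
try:
    pickle.loads(payload)
except AttributeError:
    reproduced = True

# Workaround: re-expose the function on __main__ before calling load().
main_mod.custom_preprocessor = custom_preprocessor
restored = pickle.loads(payload)

print(reproduced)                    # -> True
print(restored("HELLO") == "hello")  # -> True
```

The cleaner long-term fix is to define `custom_preprocessor` in an importable module (functions.py already qualifies) and re-create the pickle from code that imports it from there, so the stored reference no longer points at `__main__` at all.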

REVIEW: System Collapse by Martha Wells

[Image: a security unit/robot in a humanoid shape crouching in wait in a jungle on an alien planet]

Dear Martha Wells,

Murderbot is back!! Yay!!

System Collapse takes place immediately after the events of Network Effect (the previous book in the series, Fugitive Telemetry, covers events before Network Effect). Here’s a hint: a recap of Network Effect is very helpful when going into System Collapse if you’re anything like me and can’t remember much of what happened (there have been so many books filling my brain since then). In a pinch, this Wikipedia synopsis will help. I admit I was a bit lost at first (see previous). But, once I read that Wiki entry and a couple of Goodreads reviews of the earlier book, I felt oriented and that made this book much easier to follow.

Murderbot and ART and various of its humans have returned to the planet with the alien contamination from Network Effect and are trying to convince the colonists to leave and not sign up for indentured servitude with the Barish-Estranza Corporation. Murderbot is struggling because of redacted. Yes, that’s what it says in the text. For a good portion of the book, the SecUnit is not letting on what redacted is. We only know that it has something to do with what happened before, that it has made its humans (and ART) concerned, and that whatever happened has made Murderbot feel unreliable and a liability to the team.

SecUnit and a subset of its crew, together with ART-drone (it’s all ART, but ART can split off into different iterations), go to a remote area of the planet where there may be separatists who are unaware of what has been going on and who will be vulnerable to Barish-Estranza. While there, Murderbot is challenged to begin to face the trauma of its experience in Network Effect and what that means for it going forward. Plus, things get dicey with Barish-Estranza and Murderbot is called upon to use its knowledge of human behaviour (learned from all of its media-watching experience, particularly its beloved Sanctuary Moon) to protect an ever-increasing number of humans under its care. We also catch up with Three and other characters readers have come to know and like over the course of the series.

I adore the SecUnit and its reluctant but complete devotion to its humans and ART. I love the sarcasm and the way Murderbot says “I had an emotion” or “I made an expression”, which reminds me both that it’s not a human and that it’s something very close to human. I love the way Murderbot shies away from emotions but feels them anyway, especially for people like Dr Mensah and Iris but also for its buddy, ART. I also love that those beloved human characters appreciate SecUnit for who it is, value its input and skill, and treat it with respect. Murderbot’s dry humour is the best and it had me smiling and/or laughing out loud throughout the story.

I had an actual gun, one of ART’s projectile weapons, but we knew from experience how many shots it took to down an enraged ag-bot, and getting up right on its processor for a point-blank impact was not something anybody wanted me to try to attempt, especially me.

I feel a bit like Iris and Dr Mensah about SecUnit myself actually.

Murderbot ends the story in a good place – physically and mentally and ready for more adventures.

The best thing I can say about System Collapse is that it made me want to re-read (or re-listen) to the entire series again from start to finish and I’m planning to do just that soon – maybe over the Christmas break.

Definitely recommended (but remember what I said about the Network Effect recap). Grade: A

Regards,
Kaetrin

 


System Collapse by Martha Wells
November 14, 2023 · Tor.com
Paranormal, Romance
Grade: A

TL;DR: The new Murderbot book, System Collapse, is very, very good and if you were a fan of the other books in the series you’ll likely enjoy this one just as much.

A small piece of framing for this review, to better understand the praise that is about to erupt: I have been having a lot of trouble reading text.

This has been happening intermittently for a few months now, but has been much more acute since early October. No need for guesses as to why; I definitely know why.

I can listen, and audiobooks have been wonderful for me, but each time I try to read a book digitally or in print, I have a wretched time sinking into the story the way I usually do. I can submerge myself into an audiobook and it’s blissful, but the minute I try to read a book with my eyes, I bounce right off like it’s tucked inside a hard shell and I can’t crack the coating. (The pool cover is on! It’s not fair!)

This issue is particularly frustrating because I receive a lot of ARCs in digital galley form, and few audiobooks are available in advance of release date for a host of reasons – and to be clear, I am not complaining about having ARCs, or about the production time of an audiobook.

I’m complaining that my brain currently doesn’t like using my eyeballs to read and is refusing the input of text in any visual format. It’s unspeakably annoying.

On top of Brain Says No To Text Inputs, I’ve had a tiresome amount of insomnia where my brain cannot calm down. Last weekend, when I could not turn my brain off, I reached for the Murderbot series, and started System Collapse.

It was a good thing it was the night we changed the clocks back because not only did I deep-dive into the text, but I stayed up embarrassingly late reading. I didn’t gulp the book down, either, but read at a somewhat leisurely pace, following the story and nearly tearing up with relief that I was reading. I went to sleep, but kept reading, and finished the book the following day. I read, slept, had coffee, had lunch, and read. That has not happened in a dreadfully long time.

So when I say that this book broke not only a reading slump but a reading block, that’s what I mean. I’m far too tired to load the squee cannon with anything but cotton balls, so trust me when I say, squee incoming. A tired squee, but squee nonetheless.

So, anyway, a review of the book.

System Collapse starts shortly after the end of Network Effect, with Murderbot and ART, along with humans from Preservation and from the University of Mihira and New Tideland, trying to figure out how to help several colonies of humans who have been abandoned and are suffering from alien remnant contamination. In addition to Murderbot, and ART, there’s also a…

Spoiler:

second rogue SecUnit, Three, who has a very small role in this book, but I imagine is going to inspire reams of fanfic.

When Murderbot and some of the humans learn that there’s a possible third group of colonists who have also been abandoned and refused contact with the other groups for forty years, they try to find and help them.

Spoiler:

Part of ART’s crew’s activities under the guise of university research include forging contracts and charters for colonies and planets abandoned by corporations in order to help those colonists avoid being conscripted into unpaid labor for other corporations. The addition of a second or third group of colonists adds impossible complications to their plans to forge charters to help them all.

Adding to the complexity of the situation is Murderbot’s redacted commentary in the first third or so of the book. Murderbot, as usual, narrates the story from the first person, and parts are redacted, specifically parts about an event in the recent past and the resulting emotions and anxiety.

The redacted portions are later explained, but in the initial chapters, I wondered who exactly Murderbot was hiding information from: itself or us? Removing parts of its narrative, or hiding them from itself, doesn’t improve the situation inside Murderbot’s head. That situation is not great, and Murderbot has to keep going even as, inside, its mental health and emotional stability are collapsing. Whenever someone asks how Murderbot is, the reply is “I’m fine. I’m fine, everything’s fine.”

Murderbot’s redacted account is one of many examples of how the story gently unpacks how our identity and our connection to others are found in the stories we tell one another and how honest our motivations are as we tell those stories. The themes from the prior books also continue to evolve and take on additional meaning, especially the importance of being recognized as a person as opposed to a tool or as an asset to be leveraged and exploited.

I think one reason this book was so easy for me to engage with is that it starts and then it GOES. Ever been on one of those roller coasters where you get in the little cab and sit there with your restraint on, and the folks there do a little safety briefing, check the restraints, and give some hand signals and everything is just chill, but then immediately out of the gate you drop into terrifying speeds and just GO, like the stillness before never was? This book is like that.

Murderbot is dealing with immediate problems and past problems, and the immediate ones are similar enough to the past ones that they exacerbate one another. It’s a bit of an adrenaline read. Everything that’s happening around Murderbot (and everything is indeed happening so freaking much) is heightening the chaos and instability inside Murderbot, and both sets of problems keep building speed. Murderbot is also trying really, really hard to not need help, and to not accept help, to be an island unto itself because surely that would hurt less. And of course, that is not working because that’s not how beings work: your system of one will collapse without the support of others.

It’s also interesting to me that in Murderbot’s world, there is lots and lots and I mean luxuriant, unimaginable amounts of mental health support. There are trauma protocols, and if the existing protocols don’t work, the humans will come up with newer, better ones, and there are whole med systems with mental health just built into the environment. It’s as if mental health is inextricably connected to physical health and should be treated as the same in importance and attention! Can you imagine?!

And yet, even with the ease of psychological and mental assistance, and the truly gobsmacking availability of mental healthcare in that world, the humans AND Murderbot are often just as resistant to using it! The more things change, the more they stay the same, I guess?

Spoiler:

I also want to say that my absolute favorite part of this book was that the humans on the isolated splinter colony had their own media collection, which Murderbot and ART were rather excited about, and one of the programs is called Cruel Romance Personage.

Cruel. Romance. Personage. 

I cannot. I am unable to can, due to being entirely tickled by this title.

There had better be some fanart of the series title screen, is all I am saying here.

I could go on for another nine thousand words about all the ways this novel examines the elemental desire to tell a story, and the motivations behind selling a particular narrative. I’m still thinking about all the different ways that powerful storytelling and convincing arguments (or “argu-cussions”) are explored in this novel.

All of these people are dedicated to helping, to rescuing people in terrible circumstances, and I was both comforted and inspired by their determination to keep trying, even when things are decidedly Not Fine. My favorite parts of the past Murderbot books were all here: the sarcasm between the humans, Murderbot, ART, and other machine intelligences, and the absolute rush of the adventure plot coexisting alongside a thoughtful, nuanced examination of what it means to be a person, and how difficult it is to have emotions and feelings. The series as a whole and this book individually are about how important it is to recognize the autonomy of other beings, and for one’s own personhood to be recognized and acknowledged.

I’ll probably start over at book one and read the series through again. I think this might be the seventh or eighth time? Either way, I have little to no objectivity about this series, and I loved this book.
