Thursday, January 28, 2021

Struggling to install Python in Windows 10

Today, I tried to install Python 3.9.1 on Windows 10, and wasted around three hours in the process...

Let's not talk about what didn't work.

I am going to focus on how to get it running.

First, download the latest version of Python from the Python website. Then run the installer, but choose a custom install. Install Python for all users (requires administrator rights) so that it is installed in your "Program Files" folder, and add Python to the PATH too. There is also an "Enable Win32 long paths" option; allow that as well (it requires administrator rights too). That is part one.
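Once the installer finishes, a quick check from a new terminal confirms that the interpreter on the PATH is the one just installed (a minimal sanity check; the path in the comment is only an example):

```python
# Sanity check: confirm the interpreter on the PATH is the freshly
# installed one. Run this from a NEW terminal, since the PATH change
# only applies to sessions started after the install.
import sys

print(sys.executable)        # e.g. C:\Program Files\Python39\python.exe
print(sys.version_info[:3])  # (3, 9, 1) for the install described here
```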

Then, install the required tools.
python -m pip install --upgrade pip setuptools wheel
 
Install the virtualenv package so that you can set up virtual environments.
python -m pip install virtualenv

Finally, for EVERY project, set up its own virtual environment by going into the project folder, then:
virtualenv venv
This will create a virtual environment in a subfolder named venv. To activate the virtual environment, from the project folder:
venv\Scripts\activate

Once the virtual environment has been activated, you can install other requirements using pip, such as
pip install -r requirements.txt
or
pip install pysimplegui

I know this takes up a lot of space because you are duplicating the packages for each project, but I think it will help when you need to actually package your apps, because every project is more or less self-contained.
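As a side note, the same kind of environment can also be created with the standard library's venv module, which ships with Python 3 (a sketch of the alternative, not what this post uses; --without-pip is only there to keep the demo quick):

```python
# Create a virtual environment with the standard library's venv module,
# an alternative to the virtualenv package used above.
import subprocess
import sys
import tempfile
from pathlib import Path

with tempfile.TemporaryDirectory() as tmp:
    env_dir = Path(tmp) / "venv"
    # --without-pip skips bootstrapping pip, just to keep this example fast
    subprocess.run(
        [sys.executable, "-m", "venv", "--without-pip", str(env_dir)],
        check=True,
    )
    # The environment gets its own interpreter and activation scripts:
    # "Scripts" on Windows, "bin" on Linux/macOS.
    bin_dir = "Scripts" if sys.platform == "win32" else "bin"
    print((env_dir / bin_dir).is_dir())  # True
```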
 
For VSCode, you can then use Ctrl+Shift+P to bring up the command palette, then Python: Select Interpreter to choose the interpreter for each project. You should choose the interpreter for the virtual environment of your project. To run the file, you should click on the "Play" button at the top right corner of the editor. Or see "Environments and Terminal windows" to set up the proper terminal.

GUI for Google's text-to-speech service

In my quest to overcome my dislike of GUI programming, I have been trying out different GUI frameworks. And the best way to learn is to actually make something.

So I used the Google Text to Speech example in PySimpleGUI, and modified it. Here is the link to my repository for the gtts-gui.

 
You can now select the language and even speed up the speech a bit. The original playback speed can be a bit slow, so I added options for 1.25 and 1.5 times the original speed. I have tried out various languages like Japanese, Chinese, Thai, Korean, and French. A simple test is to enter "1 2 3 4 5" as the text, then select the language you want. You can then hear the numbers played back in different languages.


PySimpleGUI is declarative, like tkinter. For people who are used to object-oriented GUI programming, this can be a significant difference. What is good about the declarative approach is its simplicity: a simple GUI can be pieced together quickly with a few lines of code. Is this better than an object-oriented framework like Kivy? Actually, I think they are all just tools. You use the tool that best fits the job.

Update February 6, 2021: Added an improved version of the GUI that allows voice input using the SpeechRecognition Python package.



Thursday, January 21, 2021

Someday ft. Emarie (insert song for Cynthia's arc in Great Pretender)


“Someday ft. Emarie” ~シンシアのロンドン物語~
Composed & Arranged by Yutaka Yamada
Performed by YVY
--------------------------------------------------------------------
 
Was good to know you
But I'm not gonna
Pretend I'm all better
Thought we were forever

Hearts still in pieces
And your shadow follows me places, all the time
I can't shake this feeling of loneliness
That's just life I guess

But I believe, someday soon
I'll learn to overcome it
The scars will heal, heart will bloom
No more dwelling
'Cause today I choose to say
"I will love again someday"
So I'm not gonna stay
Paralyzed 'cause I'm not by your side

Time passes slowly
Makes me feel lonely
Gotta stop feeling sorry
I can't always feel worried

Mind won't stop spinning
And I keep on overthinking, all the time
I hope with time the healing will mend my soul
And I can let you go

But I believe, someday soon
I'll learn to overcome it
The scars will heal, heart will bloom
No more dwelling
'Cause today I choose to say
"I will love again someday"
So I'm not gonna stay
Paralyzed 'cause I'm not by your side
But I believe, that I will be, much better eventually
But I won't forget our love, ever

Time heals all, know that time heals all
I believe that someday soon
I'll remember you with a smile
And overcome
And I will overcome

Someday soon I'll overcome
Someday soon I'll Over, over-come
I choose to say
"I will love again someday, again someday,"
I'll overcome

Lyric source: https://www.animesonglyrics.com/great-pretender/someday

Wednesday, January 20, 2021

Creating a font using FontForge

As part of the Tellsis language translator, I learnt to create my own font.
 

I used FontForge, and basically traced out each character in it.

First, I set the font type to stroked font with a stroke width of 1.

Then, I drew each glyph by using a picture of each character as a background and tracing it out.

Then, I changed the font type to outline font, selected all glyphs, and used Element -> Expand Stroke so that the single-line traces become outlines. (See this page for information.)

For spacing, I used the hint here.

Okay, by now, you should have guessed that this post is more a note to myself, so that I can remember what I did. 😅 Hopefully, it serves as a hint to others too.

P.S. Remember to create the "space" glyph too. All that is needed is to set the left and right spacing.

Tuesday, January 19, 2021

Platinum fountain pen sold at Daiso

I mentioned that I saw a cheap-looking Platinum fountain pen being sold at Daiso. And since the Daiso fountain pens I tried worked so well for their price, I thought I would give the Platinum fountain pen a try too, even though it looks really cheap. I mean, this is a 100-yen fountain pen.
 

It comes with a Platinum black ink cartridge, sealed with a ball bearing.


It writes great for its price. However, it took a while for the ink to reach the nib after the cartridge was inserted, so you need to be patient.


The design of the pen also makes it "dangerous" when inserting a cartridge. It can be potentially messy, so I would advise wearing an apron when inserting and removing cartridges.


Still, it uses the common Platinum cartridge which means it is reusable, and can potentially be fitted with a converter too. Though the converter costs more than the pen itself. Like, five times more. 😅
 
The cap even has the Slip & Seal mechanism that Platinum is known for. At least, I think so, since I used a toothpick to push against the cap and could determine that it is spring-loaded inside.

In short, this is a great pen for its price. Yes, there are better fountain pens out there with smoother nibs and better ink flow, but they cost exponentially more. For 100 yen, this fountain pen writes way better than expected, making it a very cost-effective option for anyone still learning about fountain pens or just needing one every once in a while.

Sunday, January 17, 2021

Daiso fountain pens

Recently, I wrote about Platinum Preppy fountain pens, which are cheap fountain pens that write quite well for the price point. Well, surprise surprise... I found fountain pens on sale at Daiso for 100 yen each.

It kind of looks like the Platinum Preppy.


It even has the same spring-loaded cap as the Platinum fountain pens, which helps to slow down drying of the nib.
 
Surprisingly, they write well... I mean, they are 100 yen each. I didn't even think the ink would flow properly, but I managed to write with them!

Daiso even sells a Platinum fountain pen at 100 yen, but it looked so cheap that I didn't think it was worth the effort. Maybe I should get it just to try out.

By the way, in addition to light blue and pink, Daiso also sells these fountain pens in red and orange.

Friday, January 15, 2021

Tellsis language ("Nunkish") translator written in Python3

(Update: There is a version that uses Flutter for the UI and can run on Windows, Android, and Linux. See this post for more information and download location.)
 
Update 6 September 2022: I updated the script slightly, as the PyPI version of google_trans_new is outdated. Instead of installing it via pip, a local copy is used. Also fixed the handling of names. Of course, the Flutter version (mentioned at the start of the post) is recommended, since that is the one I am more likely to work on and maintain.
 
I noticed a lot of views on my updated "Nunkish" script, but I kind of felt bad as that script was really written by someone else. I only updated it with an alternative Google Translate API library to allow it to work.

So I embarked on a "quest" to make my own version: a true bidirectional translator for the Tellsis language that can translate both to and from Tellsis. (I also wrote about this script on my Japanese blog.)

And v0.1 is now ready, after a day of coding during my free time. The Tellsis language translator can be found at the Github repository here. (Update 28 August 2023: I just found official sources that state "Tellsis" and have changed the text in this blog from "Telsis" to "Tellsis", but the software will not be updated, as it may impact others who have already downloaded it.)

It works from the commandline.
$ ./telsistrans.py -t "I love Major \\Gilbert\\" -sl en
Nun posuk Gilbert ui gikapmarikon
$ ./telsistrans.py -t "Posuk \\Gilbert\\ nunki." -sl telsis
Thank you Major Gilbert.

It works in interactive mode.
$ ./telsistrans.py -i
Source language: en
Input source text: I love you
Target language:       
In Tamil script: நான் உன்னை நேசிக்கிறேன்
Pronunciation: Nāṉ uṉṉai nēcikkiṟēṉ
In unaccented characters: Nan unnai necikkiren
In target language: Nun annui noyirrikon
Source language: telsis
Input source text: Nunki posuk
Target language: ja
In Tamil script: நன்றி மேஜர்
Pronunciation:
In unaccented characters: Nanri mejar
In target language: ありがとう少佐


It works as a library.
from telsistrans import telsis_translator
translator = telsis_translator()
srctext = "I love you"
srclang = 'en'
translator.lang2telsis(srctext, srclang)
print(translator.results['tgt_text'])  # Print out results of translation
srctext = "Nunki posuk"
tgtlang = 'ja'
translator.telsis2lang(srctext, tgtlang)
print(translator.results['tgt_text'])  # Print out results of translation

 
Output of the above example:
Nun annui noyirrikon
ありがとう少佐
 
It can even output in the actual Tellsis alphabet if you supply a font file.


Other improvements include being able to use backslashes to mark names that should not be processed by the substitution cipher. The script can handle the translation of phrases instead of individual words, but I have not tested it with full pages of text yet (it may not work because of issues with handling punctuation).
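The backslash markers can be handled with a simple split before applying the cipher. This is only my illustrative sketch of the idea (protect_names is a hypothetical helper, not the actual function in the script):

```python
# Hypothetical sketch: transform only the text outside \...\ markers,
# passing marked names through untouched.
import re

def protect_names(text: str, transform) -> str:
    # Splitting on the capture group leaves protected names at odd indices.
    parts = re.split(r"\\(.+?)\\", text)
    return "".join(p if i % 2 else transform(p) for i, p in enumerate(parts))

# With str.upper standing in for the real translation step:
print(protect_names(r"Posuk \Gilbert\ nunki.", str.upper))
# POSUK Gilbert NUNKI.
```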
 
A lot of details about how the Tellsis language (called Nunkish by fans) was decoded can be found in this Reddit post. During the production staff event at Shinjuku Piccadilly cinema, Suzuki Takaaki (who created this language) also talked about the process, but did not disclose the intermediate language used. We now know that language is Tamil.

The substitution cipher is more or less what the Reddit post says. However, based on my tinkering, I have made the following changes to the mapping.
L <-> Q
J <-> S

The rest of the mapping:
A <-> U
C <-> Y 
E <-> O
G <-> V
H <-> T
K <-> R
M <-> P
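Putting the pairs together, the whole cipher fits in a few lines of Python. This is my own minimal sketch of the mapping above, not the code from the repository:

```python
# Minimal sketch of the Tellsis substitution cipher from the mapping above.
# Every pair is symmetric (A <-> U), so the same function both encodes
# and decodes.
PAIRS = ["LQ", "JS", "AU", "CY", "EO", "GV", "HT", "KR", "MP"]

TABLE = {}
for a, b in PAIRS:
    TABLE[a], TABLE[b] = b, a
    TABLE[a.lower()], TABLE[b.lower()] = b.lower(), a.lower()

def telsis_cipher(text: str) -> str:
    """Swap mapped letters; unmapped characters pass through unchanged."""
    return "".join(TABLE.get(ch, ch) for ch in text)

# Romanized Tamil "nanri mejar" ("thank you, Major") becomes the
# in-universe "nunki posuk", matching the interactive session above.
print(telsis_cipher("nanri mejar"))  # nunki posuk
```

Because the table is an involution, applying the function twice returns the original text, which is why one function serves both directions.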

The script requires Python3 to run, with the following libraries:
google_trans_new (Python library to use Google Translate)
unidecode (for converting to unaccented characters)
requests (for conversion to Tamil script)
Pillow (for rendering in Tellsis font)

Details can be found in the README.md of the repository, and explanation.md contains information about the conversion process and how the script works.

I have yet to do comprehensive tests on the translation results to make sure they are consistent with what can be found in the anime. If anyone is willing to do the testing, please report back on your findings in the comments here, or file an issue in the Github repository. Punctuation also seems to cause erratic behaviour; I am not sure why, but it could be due to differences in punctuation between English and Tamil. Finally, my dream is to create a GUI for this using Tk, and maybe even an Android app using Kivy. But don't get your hopes up... I really hate GUI programming, so the GUI and Android app may never happen.
 
Please feel free to leave feedback in the comments. But please be civil and forgiving, this was, after all, a work which I put together in a couple of hours.

By the way, my review of the 2020 Violet Evergarden movie (VIOLET EVERGARDEN the Movie) can be found here.
 
Update January 20, 2021: I made a simple GUI using the PySimpleGUI framework.

I tried a Kivy version too, but the default theme is a bit dark and I still haven't figured out how to change the theme, so it will be shelved for a while.
 
 
 
I also created my own font file for the Tellsis language because I do not have the rights to distribute the font files that I found. Instead of trying to contact the authors to seek permission for distribution, I decided to learn how to create my own fonts using FontForge and managed to come up with something. It is VERY rudimentary, but it serves the purpose of displaying the output in the Tellsis alphabet. After placing the font file in the ~/.fonts directory, execute
sudo fc-cache -fv
to refresh the font cache.

Update 25 January 2021: I added a simple video to demonstrate use of the GUI.

Update 14 March 2021: I am working on converting the commandline version to Dart, with a future GUI in Flutter. So far, the commandline version in Dart seems to be working. I also managed to solve the issue with use of commas. However, the Dart commandline version won't be able to display the results in Tellsis font as Dart does not have a package like Python's PIL. Therefore, displaying results in Tellsis font will have to go through Flutter.

Update 30 March 2021: After trying to learn Dart and Flutter, and a lot of trial and error in getting the Flutter layout and such, I have a working app that can run in Linux and in an Android emulator. Multiple sentences work as long as everything is enclosed within double quotation marks.




Monday, January 11, 2021

A quiet day in Yokohama's Chinatown

Today is Coming of Age Day in Japan, a public holiday so that young people who turned 20 years of age (the legal age for being an adult in Japan) can attend the coming-of-age ceremonies held by municipalities across Japan. But things are very different this year due to COVID-19.

In Yokohama, a state of emergency has been declared by the government covering Tokyo, Kanagawa Prefecture (where Yokohama is), Saitama Prefecture, and Chiba Prefecture. We usually see a lot of young people on their way to these ceremonies, or gathering after they have attended them. It is a time for them to get together with school friends, since many have since gone to different universities or started working.

Today, though, was a subdued day. Even the park outside Yokohama Stadium, which usually is full of people, was almost bare.


Chinatown, usually bustling on weekends and public holidays, had less than half the usual crowd on the "main" street.
 
 
A street behind that, there were even fewer people.


Chinese New Year is a month away. The state of emergency should be lifted by then. Hopefully.

PlatformIO project that can also be used in Arduino IDE

This is really a note to myself, since when I create Arduino sketches, I prefer to use PlatformIO. But projects created in PlatformIO, even those using the Arduino framework, do not comply with the Arduino IDE's requirement that the INO filename match that of its containing folder.

To create such a PlatformIO project:
1. Create a new project in PlatformIO, selecting Arduino as the framework
2. Rename main.cpp inside the src folder to whatever you want the sketch to be called, for example, MySketch.ino
3. Rename the project's src folder to MySketch
4. Edit platformio.ini and add in
[platformio]
src_dir = MySketch
5. Put all other required project files (.h files and such) inside the MySketch folder.

Done!

Sunday, January 10, 2021

无题 (Untitled)

自古男儿多薄情,新欢来时旧人忘。
(Since ancient times, men have often been fickle; when a new love arrives, the old one is forgotten.)



Saturday, January 09, 2021

Display WiFi RSSI, time, and date on M5Stick C

I wanted to use an ESP32 to check my WiFi signal strength, and I thought of using the M5Stick C which I had lying around. It is based on the ESP32, and has a display too, making it a compact device for this simple need.
 
And I might as well add a clock function to show the time too, since there is some real estate on the display.
 
I found this Arduino sketch which displays the time on the M5Stick C. So I decided to adapt it for my needs, changing the display slightly to add in the RSSI for the connected SSID (which has been blurred out in the picture below).


On the original sketch, a short press of button A (large button with 'M5') turns on and off the LCD. A long press (2sec) of button B (small button at top, not shown on the picture above) forces resync with the NTP server.

My sketch shows the RSSI (in dBm) in a color code. Green for good, yellow for moderate, and red for poor. The RSSI for changing from good to moderate signal is set at -60 dBm and from moderate to poor at -70 dBm. I decided on these figures after looking at the table in this article.
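The threshold logic reduces to two comparisons. Here is the same decision table as a small Python sketch (the device code itself, in C++ below, uses the identical cutoffs):

```python
# The RSSI colour logic used in the sketch: green above -60 dBm,
# yellow from -60 down to -70 dBm, red below -70 dBm.
def rssi_colour(dbm: int) -> str:
    if dbm < -70:
        return "red"     # poor signal
    if dbm < -60:
        return "yellow"  # moderate signal
    return "green"       # good signal

print(rssi_colour(-55), rssi_colour(-65), rssi_colour(-75))
# green yellow red
```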
 
Put configuration settings in a separate config.h file that contains
#define TIMEZONE     9
#define WIFI_SSID   "your_wifi_ssid"
#define WIFI_PASSWORD   "your_wifi_password"

where
TIMEZONE is the number of hours ahead of GMT (use negative number if behind GMT)
WIFI_SSID is SSID to connect to
WIFI_PASSWORD is password for SSID
Of course, you can also directly define these in the sketch itself. For the NTP server, you can change it to a server near your location, or just use pool.ntp.org which will usually return IP addresses for servers in or close to your country (but I have not really tried it, do leave a comment if it works for you).

Below is the Arduino sketch itself
-----------------------------------------------------
#include <Arduino.h>
#include <M5StickC.h>
#include <ESPmDNS.h>
#include <WiFi.h>
#include "time.h"
#include "config.h"
#include <ArduinoOTA.h>

// default hostname if not defined in config.h
#ifndef HOSTNAME
  #define HOSTNAME "m5stickc"
#endif

// use the WiFi settings in config.h file
const char* ssid     = WIFI_SSID;
const char* password = WIFI_PASSWORD;

// define the NTP server to use
const char* ntpServer = "ntp.nict.jp";

// define what timezone you are in
int timeZone = TIMEZONE * 3600;

// delay workaround counter
int tcount = 0;

// LCD Status
bool LCD = true;

RTC_TimeTypeDef RTC_TimeStruct;
RTC_DateTypeDef RTC_DateStruct;

// delay() blocks everything, so use this counter-based workaround instead
bool timeToDo(int tbase) {
  tcount++;
  if (tcount == tbase) {
    tcount = 0;
    return true;    
  } else {
    return false;
  }  
}

// Syncing time from NTP Server
void timeSync() {
    M5.Lcd.setTextSize(1);
    Serial.println("Syncing Time");
    Serial.printf("Connecting to %s ", ssid);
    M5.Lcd.fillScreen(BLACK);
    M5.Lcd.setCursor(20, 15);
    M5.Lcd.println("connecting WiFi");
    WiFi.begin(ssid, password);
    while (WiFi.status() != WL_CONNECTED) {
      delay(500);
      Serial.print(".");
    }
    Serial.println(" CONNECTED");
    M5.Lcd.fillScreen(BLACK);
    M5.Lcd.setCursor(20, 15);
    M5.Lcd.println("Connected");
    // Set ntp time to local
    configTime(timeZone, 0, ntpServer);

    // Get local time
    struct tm timeInfo;
    if (getLocalTime(&timeInfo)) {
      // Set RTC time
      RTC_TimeTypeDef TimeStruct;
      TimeStruct.Hours   = timeInfo.tm_hour;
      TimeStruct.Minutes = timeInfo.tm_min;
      TimeStruct.Seconds = timeInfo.tm_sec;
      M5.Rtc.SetTime(&TimeStruct);

      RTC_DateTypeDef DateStruct;
      DateStruct.WeekDay = timeInfo.tm_wday;
      DateStruct.Month = timeInfo.tm_mon + 1;
      DateStruct.Date = timeInfo.tm_mday;
      DateStruct.Year = timeInfo.tm_year + 1900;
      M5.Rtc.SetData(&DateStruct);
      Serial.println("Time now matching NTP");
      M5.Lcd.fillScreen(BLACK);
      M5.Lcd.setCursor(20, 15);
      M5.Lcd.println("S Y N C");
      delay(500);
      M5.Lcd.fillScreen(BLACK);
    }
}

void buttons_code() {
  // Button A control the LCD (ON/OFF)
  if (M5.BtnA.wasPressed()) {
    if (LCD) {
      M5.Lcd.writecommand(ST7735_DISPOFF);
      M5.Axp.ScreenBreath(0);
      LCD = !LCD;
    } else {
      M5.Lcd.writecommand(ST7735_DISPON);
      M5.Axp.ScreenBreath(255);
      LCD = !LCD;
    }
  }
  // Button B doing a time resync if pressed for 2 sec
  if (M5.BtnB.pressedFor(2000)) {
    timeSync();
  }
}

// Printing WiFi RSSI and time to LCD
void doTime() {
  //if (timeToDo(1000)) {
    vTaskDelay(1000 / portTICK_PERIOD_MS);
    M5.Lcd.setCursor(10, 10);
    M5.Lcd.setTextSize(1);
    M5.Lcd.printf("%s: ", WiFi.SSID().c_str());
    long strength = WiFi.RSSI();
    if(strength < -70) M5.Lcd.setTextColor(RED, BLACK);
    else if(strength < -60) M5.Lcd.setTextColor(YELLOW, BLACK);
    else M5.Lcd.setTextColor(GREEN, BLACK);
    M5.Lcd.printf("%02ld\n", strength);
    M5.Lcd.setTextSize(3);
    M5.Lcd.setTextColor(WHITE, BLACK);
    M5.Rtc.GetTime(&RTC_TimeStruct);
    M5.Rtc.GetData(&RTC_DateStruct);
    M5.Lcd.setCursor(10, 25);
    M5.Lcd.printf("%02d:%02d:%02d\n", RTC_TimeStruct.Hours, RTC_TimeStruct.Minutes, RTC_TimeStruct.Seconds);
    M5.Lcd.setCursor(15, 60);
    M5.Lcd.setTextSize(1);
    M5.Lcd.setTextColor(WHITE, BLACK);
    M5.Lcd.printf("Date: %04d-%02d-%02d\n", RTC_DateStruct.Year, RTC_DateStruct.Month, RTC_DateStruct.Date);
  //}
}

void setup() {
  M5.begin();

  M5.Lcd.setRotation(1);
  M5.Lcd.fillScreen(BLACK);

  M5.Lcd.setTextSize(1);
  M5.Lcd.setTextColor(WHITE,BLACK);
  timeSync(); // syncs time on every boot; comment out if WiFi is mostly unavailable at power-on

  // Port defaults to 3232
  // ArduinoOTA.setPort(3232);

  // Hostname defaults to esp3232-[MAC]
  ArduinoOTA.setHostname(HOSTNAME);

  // No authentication by default
  // ArduinoOTA.setPassword("admin");

  // Password can be set with its MD5 value as well
  // MD5(admin) = 21232f297a57a5a743894a0e4a801fc3
  // ArduinoOTA.setPasswordHash("21232f297a57a5a743894a0e4a801fc3");

  ArduinoOTA
    .onStart([]() {
      String type;
      if (ArduinoOTA.getCommand() == U_FLASH)
        type = "sketch";
      else // U_SPIFFS
        type = "filesystem";

      // NOTE: if updating SPIFFS this would be the place to unmount SPIFFS using SPIFFS.end()
      Serial.println("Start updating " + type);
    })
    .onEnd([]() {
      Serial.println("\nEnd");
    })
    .onProgress([](unsigned int progress, unsigned int total) {
      Serial.printf("Progress: %u%%\r", (progress / (total / 100)));
    })
    .onError([](ota_error_t error) {
      Serial.printf("Error[%u]: ", error);
      if (error == OTA_AUTH_ERROR) Serial.println("Auth Failed");
      else if (error == OTA_BEGIN_ERROR) Serial.println("Begin Failed");
      else if (error == OTA_CONNECT_ERROR) Serial.println("Connect Failed");
      else if (error == OTA_RECEIVE_ERROR) Serial.println("Receive Failed");
      else if (error == OTA_END_ERROR) Serial.println("End Failed");
    });

  ArduinoOTA.begin();
}

void loop() {
  M5.update();
  buttons_code();
  doTime();
  ArduinoOTA.handle();
}
------------------------------------------------------------
Update January 10, 2021: I noticed something weird today. I had the M5Stick C plugged into the front panel of my PC (ASRock B450 motherboard) for power, and it seems to interfere with the BIOS boot process: according to systemd-analyze, booting took 1+ minute for the firmware and another 1+ minute for the loader. Took me a while to troubleshoot this... I also ran sudo systemctl disable NetworkManager-wait-online.service, and disabling this service allowed my PC to boot into the GUI properly (it was a small issue in the past).
Update January 12, 2021: I replaced the line
if (timeToDo(1000)) {
in void doTime() with
vTaskDelay(1000 / portTICK_PERIOD_MS);
instead. The updated version of this sketch also includes OTA and mDNS hostname too.

Update January 17, 2021: Created a GitHub repository for this project here.

Update January 19, 2021: The sketch has been updated to resync after a certain period of time. This latest sketch can be found on the GitHub repo.

Friday, January 08, 2021

Prevent the .xsession-errors file from growing too big

After playing around with HDMI capture on my Raspberry Pi 4, I ended up with a .xsession-errors file that was 16GB in size! 😱 This prevented my system from working properly. I deleted the file and rebooted the system, which helped, but I wanted to prevent the same thing from happening again.

I found two main ways.

One is to redirect the output of X session errors to null. First, make a backup of the file to change by
sudo cp /etc/X11/Xsession /etc/X11/Xsession.bak
Then, edit /etc/X11/Xsession and change the line
exec >>"$ERRFILE" 2>&1
to
exec >> /dev/null 2>&1
This redirects all error logging for X sessions to /dev/null. I don't like this method since it means there is no proper log of any errors.

The second method is to periodically check the size of the .xsession-error file, then empty it when it becomes too big. This is done using cron. For example, using the command
crontab -e
Then, add in
*/15 * * * *  [ $(du -k /home/$(whoami)/.xsession-errors | awk '{ print $1 }') -gt 50000 ] && >/home/$(whoami)/.xsession-errors
This checks the size of the .xsession-errors file every 15 minutes. If it is bigger than 50,000 kilobytes (about 50 MB), the file is emptied. You can change the size limit as necessary.

du -k /home/$(whoami)/.xsession-errors gives the size of the file in kilobytes.
awk '{ print $1 }' prints the first field of the piped output, which is the size of the file.
-gt 50000 checks whether the result is greater than a number (50000 in this case), returning true if so, which causes the final command to be executed.
>/home/$(whoami)/.xsession-errors empties the file.
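For reference, the same check-and-truncate logic can be sketched in Python (a hypothetical equivalent of the cron one-liner, not something this post actually uses):

```python
# Empty a file once it grows past a size limit, mirroring the cron
# one-liner above (50,000 KB by default, like the crontab entry).
import os

def truncate_if_too_big(path: str, limit_kb: int = 50_000) -> bool:
    """Truncate the file to zero bytes if it exceeds limit_kb kilobytes."""
    if os.path.exists(path) and os.path.getsize(path) > limit_kb * 1024:
        open(path, "w").close()  # same effect as the shell's >file
        return True
    return False
```

Scheduled every 15 minutes (from cron or a systemd timer), this would behave like the crontab entry.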

Thursday, January 07, 2021

Seven-herb porridge 七草粥

There is a tradition on the seventh day of the first month to eat a porridge mixed with seven vegetables/herbs. The tradition came from China, and it is still practised in certain areas on the seventh day of the first month according to the lunar calendar. In Japan, ever since the lunar calendar was dropped, this seven-herb porridge (七草粥) is eaten on the seventh of January each year instead.

The seven vegetables/herbs are: 芹 (water dropwort), 薺 (shepherd's purse), 御形 (cudweed), 繁縷 (chickweed), 仏の座 (nipplewort), 菘 (turnip), and 蘿蔔 (radish).
 

I decided to observe this custom this year (it is a good chance to eat healthily), and made my own seven-herb porridge using a pack of ingredients bought from the supermarket. After dipping the vegetables in boiling water (with a bit of salt) for a short while, I diced them. Then, I cooked porridge using water and rice (and barley). After about 30 minutes, I added the diced vegetables, added a bit of salt and soy sauce for flavour, and cooked until the vegetables were all soft.


The taste was quite bland, but a dash of sesame oil and a bit of white pepper helped to spice it up a bit.

(In Singapore and Malaysia, it seems this seven-herb porridge has been replaced by qicai yusheng (七彩鱼生, meaning "seven-coloured raw fish salad"). This may be due to many people of southern Chinese ancestry, such as the Teochews, who have a custom of eating fish on the seventh day of the first month.)

HDMI capture using Raspberry Pi 4

My Sony TV only has two HDMI inputs, but I want to connect my Raspberry Pi 4, Amazon Fire TV Stick, and Blu-ray player to it. So what can I do?

After some thought... I rarely use my Blu-ray player, so the two inputs should be used for the Raspberry Pi 4 and the Fire TV Stick. But I still want to use my Blu-ray player once in a while, which means I have to find some solution. It was around this time that I came upon a YouTube video about cheap HDMI capture devices. So I thought, why not give it a try?

The concept is to connect the HDMI output of the Blu-ray player to a cheap HDMI capture device, plug that capture device into the Raspberry Pi 4 (which is connected to the TV via HDMI), and use VLC to play the input from the Blu-ray player (captured via the capture device).

Basically:
1. Raspberry Pi 4's video output is connected to the TV via HDMI.
2. Blu-ray player's video output is connected to the capture device via HDMI.
3. Capture device is plugged into Raspberry Pi 4 via USB.
4. VLC is used to show the input captured by the capture device.

I got this USB HDMI capture device on Amazon Japan, which was having a New Year sale. In addition to the sale's discounted price, there was a further 20% discount coupon, which made it even cheaper (around 1,500 yen after all discounts). Besides the discount, the other reason for choosing this model is that it has the USB connector on a cable, which means I don't have to find a separate USB extension cable, since I don't want a device sticking out from the Raspberry Pi 4.
 

dmesg gives the following:
usb 1-1.2: new high-speed USB device number 5 using xhci_hcd
usb 1-1.2: New USB device found, idVendor=534d, idProduct=2109, bcdDevice=21.00
usb 1-1.2: New USB device strings: Mfr=1, Product=2, SerialNumber=0
usb 1-1.2: Product: USB3. 0 capture
usb 1-1.2: Manufacturer: MACROSILICON
hid-generic 0003:534D:2109.0005: hiddev97,hidraw4: USB HID v1.10 Device [MACROSILICON USB3. 0 capture] on usb-0000:01:00.0-1.2/input4
uvcvideo: Found UVC 1.00 device USB3. 0 capture (534d:2109)
usbcore: registered new interface driver uvcvideo
USB Video Class driver (1.1.1)
usbcore: registered new interface driver snd-usb-audio

v4l2-ctl -d /dev/video0 --list-formats-ext
gives a list of video formats supported. The chip being used should be a MacroSilicon MS2109.

ioctl: VIDIOC_ENUM_FMT
    Type: Video Capture

    [0]: 'MJPG' (Motion-JPEG, compressed)
        Size: Discrete 1920x1080
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.040s (25.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1600x1200
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.040s (25.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1360x768
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.040s (25.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1280x1024
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.040s (25.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1280x960
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.040s (25.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1280x720
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.020s (50.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1024x768
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.020s (50.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 800x600
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.020s (50.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 720x576
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.020s (50.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 720x480
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.020s (50.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 640x480
            Interval: Discrete 0.017s (60.000 fps)
            Interval: Discrete 0.020s (50.000 fps)
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
    [1]: 'YUYV' (YUYV 4:2:2)
        Size: Discrete 1920x1080
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 1600x1200
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 1360x768
            Interval: Discrete 0.125s (8.000 fps)
        Size: Discrete 1280x1024
            Interval: Discrete 0.125s (8.000 fps)
        Size: Discrete 1280x960
            Interval: Discrete 0.125s (8.000 fps)
        Size: Discrete 1280x720
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 1024x768
            Interval: Discrete 0.100s (10.000 fps)
        Size: Discrete 800x600
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 720x576
            Interval: Discrete 0.040s (25.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 720x480
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
        Size: Discrete 640x480
            Interval: Discrete 0.033s (30.000 fps)
            Interval: Discrete 0.050s (20.000 fps)
            Interval: Discrete 0.100s (10.000 fps)
            Interval: Discrete 0.200s (5.000 fps)
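As an aside, the "Interval" values in the listing are simply the reciprocal of the frame rate (seconds per frame), which is why 60 fps shows up as 0.017s. A quick check:

```python
# Interval (seconds per frame) is just 1 / frame rate, matching the
# v4l2-ctl listing above (0.017s = 60 fps, 0.200s = 5 fps, etc.).
for fps in (60, 50, 30, 25, 20, 10, 5):
    print(f"{fps:2d} fps -> {1 / fps:.3f}s per frame")
```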


Based on the output above, the device is supposedly USB 3.0 and capable of 1080p capture at 60 fps. But when I tried 60 fps, it did not work, and even 1080p at 30 fps gave me problems: stutter during playback, with buffering issues. I suspect the capture device cannot stream out the data fast enough for VLC to display at 30 fps. I settled on 1080p at 25 fps, which seems to allow stable playback without stutter.

This is the command I ended up using to display the captured input in VLC.
cvlc v4l2:///dev/video0 :v4l2-standard= :input-slave=alsa://hw:2,0 :v4l2-chroma=MJPG :v4l2-width=1920 :v4l2-height=1080 :v4l2-aspect-ratio=16\:9 :v4l2-fps=25 :live-caching=300

What it does is:
- Use /dev/video0, the identifier for this capture device. Depending on what else is connected, it may be some other number, like video1, video2, etc.
- The audio device is hw:2,0, but again, this may differ. I got this from running VLC and selecting "Open capture device...".
- Chroma is MJPG; otherwise it defaults to YUYV, which is uncompressed and can only manage 5 fps at 1080p.
- Video resolution is 1920 by 1080 (aka 1080p) with an aspect ratio of 16:9.
- Frame rate is set at 25 fps.
- Live caching is set at 300 ms, which seems to provide enough of a buffer for smooth playback.

In TwisterOS, I created an app launcher that runs this command, so next time, all I have to do is click on the app launcher icon to call up VLC with these settings.
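For reference, the command the launcher runs can be kept in a small wrapper script so the device numbers are easy to edit later. This is just a sketch: VIDEO_DEV and AUDIO_DEV are the values from my setup and will likely differ on yours (find them with v4l2-ctl --list-devices and arecord -l).

```shell
#!/bin/sh
# Sketch of a launcher script for the cvlc command above.
# VIDEO_DEV and AUDIO_DEV are assumptions from my setup.
VIDEO_DEV=/dev/video0
AUDIO_DEV=hw:2,0
FPS=25

CMD="cvlc v4l2://$VIDEO_DEV :v4l2-standard= :input-slave=alsa://$AUDIO_DEV \
:v4l2-chroma=MJPG :v4l2-width=1920 :v4l2-height=1080 \
:v4l2-aspect-ratio=16:9 :v4l2-fps=$FPS :live-caching=300"

echo "$CMD"    # replace this echo with: exec $CMD
```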

Based on articles that I read (see this and this), it seems the colors may be a bit off, but they can be adjusted. The following are settings which I want to try but have not done so yet.
v4l2-ctl -d /dev/video0 --set-ctrl=brightness=-9
v4l2-ctl -d /dev/video0 --set-ctrl=contrast=148
v4l2-ctl -d /dev/video0 --set-ctrl=saturation=127
 
So now, I am able to watch Blu-ray/DVDs again! 😄

Update January 9, 2021: If anyone is having trouble with video playback on Raspberry Pi OS, such as the screen hanging, do a sudo apt update and sudo apt full-upgrade to fix it. A recent update had a bug, but it was quickly fixed once the issue was raised on the forums.

Tuesday, January 05, 2021

Nice quotes from Violet Evergarden the Movie (11th viewing)

It is the start of a new year, and so I went to catch Violet Evergarden the Movie again. First time for 2021. Total running count would make this the eleventh time watching the movie. 😅
 
I caught the show in the morning. They were still handing out the postcards at the Dolby Cinema that I went to. Plus the plastic folder, which was handed out a while ago.
 


Anyway, for this post, I will be focusing on quotes from the movie that I like. Plus some miscellaneous findings. As usual, I was 😭 throughout the movie.

The quotes:
 
「強く願うと思いはかなうものだな」
「強く願ってもかなわない思いはどうすればよいのでしょうか」
"Wishes come true if one hopes for them strong enough."
"What should one do if one's wish will not come true no matter how strong one hopes?"
 
「忘れるは難しい。。。生きている限り。。。忘れることは、できません」
"Forgetting is difficult... As long as I am alive... Forgetting is... impossible."
 
「愛してるを知ったから、愛してるを伝えたいと思いました」
"Because I understand what 'I love you' means, I want to convey 'I love you'."
 
「言葉にも態度にも気持ちにも表と裏があって、目に見えるものがすべてではないのだと少しずつ分かってまいりました。」
"I have slowly come to understand that there are hidden meanings behind words, attitudes, and feelings. Not everything can be seen."
 
「本当の気持ちは、伝えなければわからない場合も多いです。」
"There are many times when one's true feelings cannot be understood if they are not conveyed explicitly."

「伝えたいことはできる間に伝えておく方がよいと思います。」
"It is best to convey what you want to convey when you can."

「言葉で言えなくても手紙ならできるかも。」
"Even if the words are hard to say out loud, maybe letters can be used to convey them."

「伝えたいあの人は、今、この時しかいないから。」
"Because the people that I want to express my feelings to will only be around for now."
 
After the quotes, these are some other details I noticed in the movie.
 
Hodgins has a rabbit and a cat on his desk. These were the stuffed toys (together with the puppy) which he wanted to give Violet in episode 1 of the TV anime series. Violet chose the puppy, so I guess the rabbit and cat got to stay with Hodgins.
 
Violet visited Yuris three times. The first was when he called the postal company and she had to hide under the bed; he had orange roses in his room during this visit. The second was when they talked about how to write the letters; this time he had white roses in his room, and a pot of yellow flowers by the window. The third visit was to seal the letters, by which time the pot of yellow flowers had been replaced by a pot of pinkish-purple flowers. This was the visit when they talked about Luka. (Either that, or the pot of yellow flowers was a drawing error and there were only two visits. 😅)

Diethard said he wanted to put Gilbert in a sack and throw him at Violet's feet. This is in reference to the time when he gave Violet to Gilbert as a "weapon". Back then, Violet was the one in a sack at Gilbert's feet.
 
Maybe I should watch it again to see if I can spot more details... 🤔
But I am going to avoid watching it in the morning. It is hard to concentrate on the movie when you are feeling sleepy. 😅

My overall thoughts on Violet Evergarden The Movie.

Events:
 
Translations of short stories:
Gilbert Bougainvillea and the Fleeting Dream (unofficial translation of "ギルベルト・ブーゲンビリアと儚い夢")
The Starry Night and the Lonely Two (unofficial translation of 星降りの夜とさみしいふたり)
Diethard Bougainvillea's If (unofficial translation of ディートフリート・ブーゲンビリアIf) 
The Tailor and the Auto-Memories Doll (unofficial translation of 仕立て屋と自動手記人形)
 
Tellsis (Nunkish) translation:
Last line of Violet's final letter to Gilbert
 
Insights on the movie:
 
Audio commentary notes:

 
All posts related to Violet Evergarden.
 
Update January 7, 2021: A state of emergency has again been declared in the area where I live, so it may be a while before I get to watch the movie in a cinema again. Hopefully, the nearby Dolby Cinema continues to air the movie for some time to come... 

Update January 10, 2021: The nearby Dolby Cinema is now the ONLY theatre in Kanagawa Prefecture (where Yokohama City is) that is still showing the movie. I don't think I will be able to catch it again... 😢 so 11 times is probably the record. Unlike Klose_Rinz_ who has watched it 68 times as of January 9, 2021...

Monday, January 04, 2021

Mori no Gakkou (2002 film)

At the start of the new year, I managed to catch a rare screening of the 2002 film Mori no Gakkou (森の学校, literally The Forest School). The movie is based on a book written by Kawai Masao (河合雅雄), and gives a snapshot of a year in the life of an elementary school student living during the early Showa period (specifically, 1935). This movie is rare in that it was only released in theatres back in 2002, and was never released on tape or DVD. Since then, it has been screened every once in a while at different places all over Japan, but never on any large scale.
 

While this is a 2002 film, it was shot like a Showa period film, giving viewers a sense of watching something filmed in the 1950s. The movie stars Miura Haruma playing the role of Kawai Masao when he was in elementary school, as he struggled with a poor constitution prone to falling sick while enjoying his love of animals. To young Kawai, the forest is like a school, teaching him things like the value of each and every living thing, the love of his grandmother, and even what it is to fall in love. Miura himself was 11 to 12 years old when he played this role, and he played it so well. A true prodigy actor, even when still a child.

If you ever have the chance, do catch this rare movie. It reminds us of the things around us that we tend to overlook. The movie is recommended by Japan's education, labour, and environment ministries for its content, which I think says a lot about its educational value. It is a movie that I hope children (those in the upper years of elementary school) will watch, because it is presented in a way that they can understand. The official website (given below) has details about screenings (when they happen). It may be difficult to catch this outside Japan, though.

Trailer (from YouTube)


Updated "Nunkish" translator script

Update January 15, 2021: This post has been superseded by the bidirectional Tellsis language translator (which is the enhanced version of the script here) that can be found here. And this version that runs on Windows, Linux, and Android.

 
The online Python script that can be used to translate Nunkish (name given by fans to the language used on the Tellsis continent) into English has an issue with the googletrans module, so I found a way to update it to use the google_trans_new module instead.
(Also posted on my Japanese-language blog.)
My review of the 2020 Violet Evergarden movie (VIOLET EVERGARDEN the Movie) can be found here.

To use this, you need to install the google_trans_new module using
pip3 install google_trans_new

Then, run the script using
python3 nunkish_translator_2.py

Script (nunkish_translator_2.py)
-----------------
from google_trans_new import google_translator  
import requests
 
translator = google_translator()  

alphabet = {
    'a': 'u',
    'b': '',
    'c': 'y',
    'd': '',
    'e': 'o',
    'f': '',
    'g': 'v',
    'h': 't',
    'i': 'i',
    'j': '',
    'k': 'r',
    'l': 'i',
    'm': 'p',
    'n': 'n',
    'o': 'e',
    'p': 'm',
    'q': 'l',
    'r': 'k',
    's': 'y',
    't': 'h',
    'u': 'a',
    'v': 'g',
    'w': '',
    'x': '',
    'y': 'c',
    'z': '',
    'A': 'U',
    'B': '',
    'C': 'Y',
    'D': '',
    'E': 'O',
    'F': '',
    'G': 'V',
    'H': 'T',
    'I': 'I',
    'J': '',
    'K': 'R',
    'L': 'I',
    'M': 'P',
    'N': 'N',
    'O': 'E',
    'P': 'M',
    'Q': 'L',
    'R': 'K',
    'S': 'Y',
    'T': 'H',
    'U': 'A',
    'V': 'G',
    'W': '',
    'X': '',
    'Y': 'C',
    'Z': '',
    ' ': ' ',
    '0': '0',
    '1': '1',
    '2': '2',
    '3': '3',
    '4': '4',
    '5': '5',
    '6': '6',
    '7': '7',
    '8': '8',
    '9': '9',
}

tamil_script_url = 'https://inputtools.google.com/request?text={text}&itc=ta-t-i0-und'

source_language = 'ta'
target_language = 'en'

def trans(nunkish):
    tamil = ""
    for char in nunkish:
        if char not in alphabet:
            # Pass through characters that are not in the table (punctuation, etc.)
            tamil += char
        else:
            # Substitute the letter; letters with no mapping become "?"
            tchar = alphabet[char]
            tamil += tchar if tchar else "?"

    print(f"Converted to Tamil: {tamil}")

    tamil_res = requests.get(tamil_script_url.format(text=tamil), headers={
        'Content-Type': 'application/json'
    }).json()

    if tamil_res[0] != 'SUCCESS':
        return "Transliteration request failed"
    tamil_script = tamil_res[1][0][1][0]
    print(f"In Tamil script: {tamil_script}")
    return translator.translate(tamil_script, lang_src=source_language, lang_tgt=target_language)

while True:
    nunkish = input("Input nunkish: ")
    print(f"You entered: {nunkish}")
    print(trans(nunkish.replace('\n', ' ').replace('\r', '')))
---------------------
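The script pulls the transliteration out of the Google Input Tools response with tamil_res[1][0][1][0]. To make that indexing easier to follow, here is a simplified, assumed example of the response shape (the field layout is my reconstruction from the indexing the script uses, not official documentation):

```python
# Simplified (assumed) shape of a Google Input Tools JSON response,
# illustrating what tamil_res[1][0][1][0] picks out in the script above.
sample_response = [
    "SUCCESS",                      # tamil_res[0]: request status
    [
        ["vanakkam",                # the romanized input that was sent
         ["வணக்கம்"],                # candidate transliterations in Tamil script
         [],
         {"candidate_type": [0]}],
    ],
]

status = sample_response[0]
first_candidate = sample_response[1][0][1][0]   # first candidate of first word
print(status, first_candidate)
```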
Update January 14, 2021: To use this script with results in other languages, change the line
target_language = 'en'
to something like the below (example using Japanese as desired output language)
target_language = 'ja'
I also removed the app id and key from the post since I am not sure if the author of the original script actually wants to allow everyone to freely use that id/key pair, given that the Google Translate API is now a paid service. The google_trans_new module seems to be free to use.
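Incidentally, the alphabet dictionary in the script is just a letter-substitution table, so the first step of trans() can also be written with Python's built-in str.maketrans. Here is a minimal sketch (lowercase letters only; letters the table leaves blank become "?", matching the script's behaviour):

```python
# Minimal sketch of the Nunkish -> romanized-Tamil substitution step,
# using the same lowercase mapping as the script's alphabet table.
MAPPING = {
    'a': 'u', 'b': '?', 'c': 'y', 'd': '?', 'e': 'o', 'f': '?',
    'g': 'v', 'h': 't', 'i': 'i', 'j': '?', 'k': 'r', 'l': 'i',
    'm': 'p', 'n': 'n', 'o': 'e', 'p': 'm', 'q': 'l', 'r': 'k',
    's': 'y', 't': 'h', 'u': 'a', 'v': 'g', 'w': '?', 'x': '?',
    'y': 'c', 'z': '?',
}
TABLE = str.maketrans(MAPPING)

def to_tamil_roman(text: str) -> str:
    """Substitute each mapped letter; spaces, digits, etc. pass through."""
    return text.translate(TABLE)

print(to_tamil_roman("hello"))  # -> toiie
```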