My Philippine Citizenship Odyssey

Background

I have a fairly unusual personal background. My parents immigrated to the US from two very different countries: Finland and the Philippines. When I was growing up, dual citizenship was a murky thing that seemed to be frowned upon and discouraged. But after the turn of the millennium, there was a sea change in the way multiple citizenship was viewed by governments. Many countries began actively encouraging dual citizenship, or the retention of one’s citizenship even after emigrating to another country.

As a result of this change, I acquired my Finnish citizenship in 2008, thanks to the Finnish nationality law, which had opened a timed window in which former Finnish citizens (such as my mother) could reacquire their citizenship, and their direct descendants could acquire Finnish citizenship for the first time. This was a very appealing prospect to me, because of Finland’s membership in the European Union, and its various agreements with its Nordic neighbors. I wound up waiting until almost the last possible moment to file my application, because for some time, there was a lack of clarity around Finland’s mandatory military service obligation, and what it might mean for new citizens. (Eventually, a clarification was issued that military service was not required of people residing outside the country, or of those 30 or over.) This procrastination, and the last-minute crush of applications, meant that it took almost a year for my application to be processed and approved.

Republic Act 9225

After submitting that application, I did a little research to figure out if I could do something similar with the Philippines. My father had, in theory, given up his Filipino citizenship upon his naturalization as a US citizen. I discovered, though, that Republic Act 9225 had been passed back in 2003, with the stated policy that “all Philippine citizens who become citizens of another country shall be deemed not to have lost their Philippine citizenship under the conditions of this Act.” There was also a provision in the law for acquisition of citizenship by children of eligible former Filipinos, but there was a catch: it was specifically limited to children below 18 years of age, which excluded me. My father went ahead and reacquired his citizenship under RA 9225, but that didn’t change the status of me or my siblings.

At first glance, it seemed like I might not be able to acquire Filipino citizenship, or at least not in the same way in which I had acquired Finnish citizenship (i.e. by a declaration related to my mother reacquiring her citizenship). After doing a little more reading, though, I realized that the Philippine nationality law follows the principle of jus sanguinis, and perhaps I would be eligible for citizenship at birth, even if I wasn’t allowed to acquire my citizenship through my dad’s RA 9225 reacquisition process. But after thinking a little bit more about this, I realized that I still might be stuck in a legal quandary.

Why? It turns out that my father had naturalized as a US citizen several months before my birth. And if he was no longer a Filipino citizen at that time, would I be ineligible for Filipino citizenship by birth?

I returned to the text of RA 9225, and pored over it. Despite the stated policy that former Philippine citizens “shall be deemed not to have lost their Philippine citizenship,” the rest of the document was littered with the word “re-acquire,” which made me think that, legally, citizenship was something that was dropped, and then picked up again later. Maybe the “stated policy” was actually at odds with the rest of the document – the actual legalese. This would be bad news for me, with regard to my dad being considered a Philippine citizen at the time of my birth. It left me confused about the legal status of my birth, and trying to pencil out the possible scenarios:

  • What if Dad lost his Philippine citizenship upon US naturalization, and only reacquired it when he reclaimed his citizenship? In this case, I would be out of luck, without any real way to become a Filipino citizen apart from the normal naturalization process.
  • What if Dad lost his Philippine citizenship upon US naturalization, but after reacquiring it, was considered to always have remained a citizen of the Philippines? With this reading, I think I would be eligible for citizenship by birth.
  • What if Dad was considered to always have remained a citizen of the Philippines, but the question of my birth status was irrevocably resolved at the moment of my birth? This is what I worried about the most – I would be forever ineligible, by virtue of falling into some weird edge case of the relevant laws.

In my confusion, I sent a couple of desultory e-mails to one of the Philippine consulates here in the US, and to the owners of some pages offering advice on Philippine immigration issues. The responses I got were muddled, and generally of the opinion that I might be out of luck, but it wasn’t clear how carefully these responses had been considered, or whether any higher levels of the bureaucracy had been consulted. My follow-up requests for clarification often went unanswered.

As a result of this discouraging start, I set the project aside for many years, unsure how to proceed. I thought about trying to find and hire a legal consultant familiar with these issues (either here in the United States, within the Filipino community, or in the Philippines itself), but never pursued it seriously.

Fast forward to 2019, when my interest in this project was reignited by a family trip to Asia, including a big chunk of time spent in the Philippines. My kids loved the Philippines, and so I started thinking about my Philippine citizenship once again, now with the children in mind. If I could claim my citizenship from birth, my kids could also claim this birthright from the country that had captured their hearts. So I decided to just try applying, and see what happened. During this process, I did more online research, and found legal precedent strongly supporting the idea that I was indeed eligible for birthright citizenship.

The Process

I would need to report my birth, as the birth of a Filipino abroad, to the Philippine consulate in New York (the consulate responsible for the region of my birth). But before I could do that, I would first need to establish that my parents were actually married, in the eyes of the Philippines. (I guess I could have just tried to report my birth as an illegitimate child, to cut down on paperwork, but that seemed…undesirable.)

This turned out to be another adventure, as my parents had actually gotten married in Canada, not the United States! So I had to help them file a Report of Marriage with the Philippine Consulate in Toronto. This necessitated ordering official copies of their marriage certificate from Ontario, getting passport pictures of my parents, and getting their signatures notarized on various forms. Because the marriage had taken place so long ago, and because the Philippines is essentially the only remaining country that does not recognize divorce, this also required ordering a “Certificate of No Marriage Record” (CENOMAR) from the Philippine Statistics Authority, to certify that there was no previous record of marriage for either of my parents.

After getting this taken care of, and receiving the stamped, approved copy of their Report of Marriage application in the mail, I asked if I could simply include that with my report of birth application, instead of having to wait to be able to order an official copy from the PSA. (The approved forms are only transmitted back to the Philippines once every few months, so you might need to wait several months before you can even order an official copy from the PSA.) Thankfully, the consulate told me that I could just include the approved report of marriage that I got in the mail, and so that was what I did. I carefully made all the required photocopies of documents (a somewhat puzzling requirement, since we live in the Information Age with easily-scannable documents), got my signatures notarized on several different forms, put together the package according to the instructions on the consulate site, and sent it off.

The Legalities

I mentioned earlier that I had found some relevant precedents in the Philippine legal system, which seemed to settle that burning question about my eligibility for birthright citizenship. The first petition (and ruling) concerned a lawyer who moved to Canada, became a Canadian citizen, reacquired his Philippine citizenship under RA 9225, and then moved back to the Philippines in 2006, intending to resume his law practice. In the Philippines, the practice of law is restricted to citizens, which raised the question of whether the petitioner was even still a member of the bar at all, after becoming a Canadian citizen. The court ruled that, because of RA 9225’s stated policy that “Philippine citizens who become citizens of another country shall be deemed not to have lost their Philippine citizenship,” the petitioner had never lost his bar membership, and merely needed to get current on his bar dues, participate in some continuing education, and retake his lawyer’s oath. Since the only requirement for a child to be a Philippine citizen by birth is the citizenship of one parent at the time of birth, my case now seemed very clear-cut. Regardless of what my dad may have thought at the time, he was a Philippine citizen at the time of my birth, and therefore, I am one as well, and have been from the moment of my birth (even though it took me a few decades to get around to notifying the Philippines about this).

The second petition regarded a similar situation, except that the petitioner had become a US citizen in 1981. The positive ruling on this petition just reinforced my feeling that I was indeed able to claim my citizenship at birth, as there seemed to be no further conditions or considerations about the length of time elapsed.

The Result

Before long, I received the approved documents back from the consulate in New York. I was indeed now officially a Filipino by birth! After a couple of months, I was able to order an official copy of my Report of Birth from the PSA. I’ve continued working towards the goal of getting my kids their Philippine citizenship, reassured now that it’s just a matter of working through the remaining bureaucracy, and no longer a question of whether I’m eligible to be a citizen at all.

The key point to take away from my experience is that as long as your Filipino parent has reacquired their citizenship through RA 9225, it doesn’t matter that they had naturalized as a citizen of another country before you were born. So if you’re in the same situation I was, fear not! You can get it sorted out, and claim your Philippine citizenship by birth.

Upgrading the Wireless Adapter of a Dell XPS 8900

Because it was kind of difficult to find information about this, I figured I would post some details about a tiny little upgrade that I did to a Dell XPS 8900. It originally came with a Dell DW-1801 wireless adapter, which only supports 802.11b/g/n, not 802.11ac. It also doesn’t support the 5 GHz band, which is really unfortunate.

All of this means that wireless performance is kind of shaky upstairs in our house. I had started researching Wi-Fi extenders, went to check exactly which 802.11ac profiles our router and wireless adapters supported, and was shocked when I realized what the problem was. I knew that our XPS 8900s came with integrated wireless, and seemed to recall that it was an M.2 device, so I looked into whether I could just swap out the wireless card for something better.
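
As an aside, if you just want to check what an adapter and its driver support on Windows, there’s a quick way to do it from a command prompt, without opening the case:

    netsh wlan show drivers

Look for the “Radio types supported” line in the output – if 802.11ac isn’t listed, the adapter itself is the bottleneck.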

Amazingly, in spite of all of the exhaustive specification information you can find on the Internet, it was actually pretty hard to figure out which M.2 physical connector the existing DW-1801 card used. I looked through my system and service manuals, and wasn’t able to find that information anywhere. This thread on Dell’s support forums was the best source of information I found, and indicated that it is size 2230, keyed A/E.

I looked around to find a replacement adapter, and ultimately opted for this Intel 8260 NGWMG that I found on Newegg. Before installing, I went ahead and downloaded and installed the drivers from Intel. The physical installation was pretty straightforward, once I found the correct screwdriver to take off the old adapter and antenna wires. On bootup in Windows 10, the adapter was recognized and it started working perfectly once I rejoined the network.

So, yeah, for curious people on the Internet – you can, in fact, upgrade the wireless adapter in a Dell XPS 8900 with a third party M.2 card.

Ludum Linguarum: Aurora

(Ludum Linguarum is an open source project that I recently started, and whose creation I’ve been documenting in a series of posts. Its purpose is to let you pull localized content from games, and make flash cards for learning another language. It can be found on GitHub.)

In this post, I’ll talk a little bit about Ludum Linguarum’s support for some of the Aurora-engine-based games out there. The Aurora engine was BioWare’s successor to its earlier Infinity Engine, and was used in quite a few games overall.

Motivation

There are quite a few games with large amounts of text that were produced with the Aurora engine (including one that I worked on), so it seemed quite natural to target it for extraction. The text in these games can also be categorized in ways that I think are interesting in the context of language learning – there are really short snippets or words (item names, spell names, skill names, etc.), as well as lengthy bits of dialogue that might make good translation exercises. Additionally, there’s quite a bit of official and unofficial documentation out there about its file formats.

Goals for Extraction

The raw strings for the game are (mostly) located inside the talk table files. However, just extracting the talk tables would lose all context around how the strings are actually used in the game. For example, the spell names, feat names, creature names, dialogues, and so on, are all jumbled together in the talk table. It sounds like a small thing, but I feel that creating a taxonomy (in the form of “lessons”) would make a big difference in the usefulness of the end product. Unfortunately, it also makes a huge difference in the amount of effort needed to extract all of this data!

How it all went

I spent quite a bit of time writing file-format-specific code, for things like the TLK talk table format, the BIF and KEY packed resource formats, the ERF archive format, and the generic GFF format. On top of that, there was code to deal with the dialogue format that gets serialized into a GFF.
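
To give a flavor of what this code looks like, here’s a minimal F# sketch of reading a TLK talk table, based on the publicly available Aurora format documentation (a 20-byte header, followed by fixed-size directory entries, followed by the raw string data). It’s an illustration of the format rather than Ludum Linguarum’s actual implementation, and it ignores the sound-related fields:

    open System.IO
    open System.Text

    let readTalkTable (path: string) =
        use br = new BinaryReader(File.OpenRead(path))
        // 20-byte header: signature, version, language ID, count, data offset
        let fileType = Encoding.ASCII.GetString(br.ReadBytes(4))  // "TLK "
        let version = Encoding.ASCII.GetString(br.ReadBytes(4))   // "V3.0"
        if (fileType <> "TLK ") || (version <> "V3.0") then failwith "not a TLK V3.0 file"
        let _languageId = br.ReadUInt32()
        let stringCount = br.ReadUInt32()
        let entriesOffset = br.ReadUInt32()
        // Each directory entry is 40 bytes; we only need the flags, and the
        // offset and size of the string data.
        let entries =
            [| for _ in 1u .. stringCount do
                let flags = br.ReadUInt32()
                br.ReadBytes(16) |> ignore   // sound resref
                br.ReadUInt32() |> ignore    // volume variance
                br.ReadUInt32() |> ignore    // pitch variance
                let offset = br.ReadUInt32()
                let size = br.ReadUInt32()
                br.ReadSingle() |> ignore    // sound length
                yield (flags, offset, size) |]
        // Resolve each entry to its text (flag bit 0x1 means text is present).
        // Single-byte encodings vary per language; ASCII is just a placeholder.
        entries
        |> Array.map (fun (flags, offset, size) ->
            if (flags &&& 0x1u) <> 0u then
                br.BaseStream.Seek(int64 (entriesOffset + offset), SeekOrigin.Begin) |> ignore
                Encoding.ASCII.GetString(br.ReadBytes(int size))
            else "")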

I started with the original Neverwinter Nights, and then moved on to Jade Empire. The console-based Aurora engine games used some variant file formats (binary 2DAs, RIM files, etc.) that needed a little extra work to deal with, but there was enough information about these available on the Internet that I was able to support them without too much hassle.

Once I had the basic file parsing code in place, it was just a matter of constructing the “recipe” of how to extract the game strings. This mostly involved sifting through all of the 2DA files for each game, looking for columns that represented “string refs” (i.e. keys into the talk table database) – extracting dialogues was much simpler since they were already in their own files, and their contents were unambiguous.
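
As a rough illustration of what one “recipe” step amounts to – with types and names that are stand-ins I made up for this post, not the project’s real API – here’s how a single string-ref column in a 2DA might get turned into (lesson, text) pairs. 2DA files use “****” to mark an empty cell:

    // Illustrative stand-ins: a 2DA as rows of named columns, and a talk
    // table exposed as a lookup from string ref to text.
    type TwoDA = { Columns: string list; Rows: Map<string, string> list }

    let extractColumn (tlk: uint32 -> string option) (table: TwoDA) (column: string) (lesson: string) =
        table.Rows
        |> List.choose (fun row -> row.TryFind(column))   // rows that have the column
        |> List.filter (fun cell -> cell <> "****")       // skip empty cells
        |> List.choose (fun cell ->
            match System.UInt32.TryParse(cell) with
            | (true, strref) -> tlk strref |> Option.map (fun text -> (lesson, text))
            | _ -> None)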

Comparison between C# and F# implementations

I had basically written all of this file parsing code before (in the C# 2.0 era, so without LINQ), but this time around I was writing it in F#. I found it very interesting to compare the process of writing the new implementation with what I remember from working on Neverwinter Nights 2 more than 10 years ago.

The F# code is a lot more concise – I would estimate on the order of 5-7x. It isn’t quite an apples-to-apples comparison with what I did earlier (for example, serialization is not supported, only deserialization), but it’s still much, much smaller. I suspect that adding serialization support wouldn’t be a huge amount of additional code, for what it’s worth.

Record types and list comprehensions really help condense a lot of the boilerplate code involved in supporting a new file format, and match expressions are both more compact and safer when dealing with enumerated types and other sets of conditional expressions. I also got a lot of good usage out of Option types, particularly within the 2DA handling, where they very neatly encapsulated default-cell functionality.
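
For instance, the default-cell idea reduces to something like the following sketch (my own illustration, not the project’s code): a cell is None when it’s missing or holds the “****” placeholder, so a default can be applied in exactly one place.

    // A cell lookup yields None for missing or "****" cells...
    let cellValue (row: Map<string, string>) (column: string) =
        match row.TryFind(column) with
        | Some v when v <> "****" -> Some v
        | _ -> None

    // ...so a per-column default is applied in one place, at the call site.
    let cellOrDefault (row: Map<string, string>) (column: string) (defaultValue: string) =
        defaultArg (cellValue row column) defaultValue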

But I think the thing that accounts for the biggest difference between my old C# implementation and the new F# implementation is the range of functional operations available – or, to put it another way, the lack of LINQ in my C# implementation. If LINQ had been available at the time I was working on Neverwinter Nights 2, I think my code would have looked a lot more like the F# version, with liberal use of Select()/map() and Where()/filter(). These operations replace very verbose blocks of object construction and selective copying, often with a single line, which is an enormous savings in code size and improvement in clarity.

I feel like there is still a lot of bespoke logic involved in extracting the individual bits and pieces of each format, but that doesn’t seem to be avoidable – the formats are not self-describing, and it seemed like it would be overkill to try to construct a meta-definition of the GFF-based formats.

Summary

Overall, I was pretty pleased with how this went. While it was a decent amount of work to support each file format, once that code was all written, the process of creating the game-specific recipe to extract strings was pretty straightforward. There weren’t really any surprises in the implementation process, which was definitely not the case for the game that I’ll talk about in my next set of posts.

Ludum Linguarum: The Simple Stuff

(Ludum Linguarum is an open source project that I recently started, and whose creation I’ve been documenting in a series of posts. Its purpose is to let you pull localized content from games, and make flash cards for learning another language. It can be found on GitHub.)

When I started this project, I figured that support for individual games would fall into one of a small set of categories:

  • Low effort, where the strings are either in a simple text file or some sort of well-structured file format like XML, where many good tools already exist to pull it apart.
  • Cases where the file formats, while bespoke, are well documented, and where there may be tools and code that already exist to parse the file formats.
  • The really hard cases – ones where there isn’t a lot of (or any) extant information about how the game stores its resources, and extracting strings and metadata about them is more of a reverse-engineering exercise than anything else.

In this post, I’ll talk briefly about a few really simple examples – games that I was able to knock out very quickly: King of Fighters ‘98, King of Fighters 2002, Magical Drop V, and Skulls of the Shogun.

King of Fighters ‘98 and King of Fighters 2002

While working on this project, I actually started on some of the other supported games first. But then I decided to take a little break, and see if there were any games out there that would be really trivial to support. Browsing through my Steam library, I realized that fighting games were probably good candidates – they contain limited amounts of text, but are definitely globalized.

Both of these games use the Xbox 360 XDK’s XUI library formats to present their UI. (I determined this from the presence of some files and directories with “xui” in their names.) All of the strings in the game are inside a file conveniently named strings.txt, inside the data directory.

This is a tab-delimited format with just four columns – a key for the string, a “category” comment field, and then one column per supported language: “en” for English, and “jp” for Japanese. (It’s interesting that the country code for Japan was used rather than the language code “ja” – I’m not sure whether that was deliberate.)

In this case, it’s super simple to extract all of the strings, because of the simple format, and because there’s only one place to look to find them all. I simply read in the file, and directly map the key column to the per-language text for each card.
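
The whole extraction boils down to something like this F# sketch – the column order matches the description above, and the function name is my own:

    open System.IO

    // key <tab> category <tab> English text <tab> Japanese text
    let readStrings (path: string) =
        File.ReadAllLines(path)
        |> Array.choose (fun line ->
            match line.Split('\t') with
            | [| key; _category; en; jp |] -> Some (key, en, jp)
            | _ -> None)  // skip blank or malformed lines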

(It’s worth noting that King of Fighters XIII doesn’t use the same format or engine, so I wasn’t able to just add support for it using the same code.)

Magical Drop V

Adding support for Magical Drop V just involved reading some XML files within its localization subdirectory, and massaging them slightly to remove invalid and undesirable text. For example, ampersands were not escaped in the XML files, which caused the .NET framework’s XML parser to complain. I also stripped out some obvious placeholder values (“<string placeholder>”).
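
The ampersand fix-up amounts to one preprocessing step before handing the text to the XML parser. Here’s a sketch of the idea – the regex, which escapes ampersands that don’t already begin an entity, is my own assumption about how to handle it:

    open System.Text.RegularExpressions
    open System.Xml.Linq

    let loadSanitizedXml (path: string) =
        let raw = System.IO.File.ReadAllText(path)
        // Escape bare ampersands that aren't already part of an entity
        // like &amp; or &#38;, so that the parser will accept the document.
        let escaped = Regex.Replace(raw, @"&(?![A-Za-z]+;|#\d+;)", "&amp;")
        XDocument.Parse(escaped)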

Overall, it was really quite simple to add support for this game, with the game-specific code only running to about 50 lines.

Skulls of the Shogun

Skulls of the Shogun is a game built on XNA and MonoGame, and it actually uses the .NET framework’s globalization support to localize its strings. Thus, I was able to use the framework’s support for loading satellite assemblies to pull out both the string keys and the localized content itself quite easily.

I actually spent more time figuring out that I had to load the assemblies using the reflection-only context (in order to keep my library and console application independent of the target assemblies’ bitness) than I did writing the rest of the code to support this game!
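
Here’s a rough F# sketch of that approach – my own reconstruction of the idea rather than the project’s actual code. It loads the satellite assembly in the reflection-only context, and then walks each embedded .resources blob with a ResourceReader:

    open System.Collections
    open System.Reflection
    open System.Resources

    let readSatelliteStrings (satellitePath: string) =
        // Reflection-only loading never executes code from the assembly, and
        // doesn't tie the host process to the assembly's target bitness.
        let assembly = Assembly.ReflectionOnlyLoadFrom(satellitePath)
        [ for resourceName in assembly.GetManifestResourceNames() do
            use stream = assembly.GetManifestResourceStream(resourceName)
            use reader = new ResourceReader(stream)
            for entry in Seq.cast<DictionaryEntry> reader do
                yield (resourceName, string entry.Key, string entry.Value) ]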

Ludum Linguarum: The Tools

(Ludum Linguarum is an open source project that I recently started, and whose creation I’ve been documenting in a series of posts. Its purpose is to let you pull localized content from games, and make flash cards for learning another language. It can be found on GitHub.)

When I started working on Ludum Linguarum, I decided to use it as an opportunity to exercise what I had been learning about the F# language on the side. This might seem like kind of a strange decision out of context, but there were a few reasons why I felt that this made sense:

  • I already had a good bit of familiarity with the .NET stack, having spent a good chunk of my years in the gaming industry writing tools in C#.
  • Because of the frequent use of C# in games and editors, I felt that there would be a greater likelihood of me finding useful, easy-to-integrate libraries and documentation for reverse engineering games than on other stacks.
  • I write Scala at my day job, so I figured that I would be reasonably well-equipped to deal with the functional programming aspects of the language, even though I had never really touched OCaml before.

At the very beginning of the project, I was working on learning F# on my commute, using Mono and MonoDevelop on an old netbook that I threw Ubuntu on. This worked (in that it is totally possible and viable to write F# and .NET code on non-Windows platforms), but later on I got a proper new laptop, threw Visual Studio 2015 on it, and never looked back. The added benefit of doing this, of course, was that, running under Windows, I could easily install and run the games that I was reverse engineering.

The benefits

All in all, I have been very pleased with my decision to use F#. Using a functional-first language let me construct composable, easily-testable pipelines, and I feel this really saved me a bunch of time as the project grew. The language is very similar in capabilities to Scala for application code, albeit with significantly different syntax and a slight verbosity tax.

When I think back to similar code I’ve written in the past, I feel that my F# code is more concise, easier to understand, and leaves less room for bugs to creep in, compared to C++ and C#. This applies to both the simple parts of the code and the much more complex parts. In a future post, I’ll go into this in some more detail.

I would go so far as to say that the things that slowed me down the most were when I strayed furthest from the functional style, and just used the full console application and the full game data as my testbed. (The reason I did this is that it can be a bit of a pain to construct test data that is compact and concise, and that doesn’t include any actual copyrighted material.) As long as I moved at a reasonable pace and built up a decent test corpus, things worked out well.

Project setup

Initially, I used the standard .fsproj and solution setup in VS. The project was set up as a plugin-based system, where all main build outputs were copied into a single output directory, and NUnit test projects were simply run in-place. This worked OK, but as I got closer to actually releasing the first version of the project, I decided that it would be better to migrate the project to the FAKE build system and Paket dependency manager. (Using those makes it simpler to keep dependencies up-to-date, and hopefully easier for the curious or motivated to build and run the project.)

I used the open source F# Project Scaffold, and reconstructed my old project setup. It took a little bit of experimentation, but I was able to get up and running pretty quickly. I did run into an issue where the recently-released NUnit 3 wasn’t yet supported by FAKE, and I had to do some legwork to get everything building with F# 4 and .NET Framework 4.6.1, but it wasn’t too bad. Now I have a very simple system to build, test, and package the project for release.
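
A FAKE build script is itself just F#. For anyone who hasn’t seen one, here’s a minimal sketch in the FAKE 4-era style that matches this timeframe – the target names, paths, and globs are illustrative, not my project’s actual script:

    #r "packages/FAKE/tools/FakeLib.dll"
    open Fake

    let buildDir = "./build/"

    Target "Clean" (fun _ -> CleanDir buildDir)

    Target "Build" (fun _ ->
        !! "src/**/*.fsproj"
        |> MSBuildRelease buildDir "Build"
        |> Log "Build output: ")

    Target "Test" (fun _ ->
        !! (buildDir + "*.Tests.dll")
        |> NUnit (fun p -> { p with DisableShadowCopy = true }))

    // Dependency chain: Clean, then Build, then Test.
    "Clean" ==> "Build" ==> "Test"

    RunTargetOrDefault "Test"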

The release packaging part is particularly important – I don’t have a lot of extra time to spend on overhead like manually making builds and uploading them, so it’s much easier for me to change the way I work, and make my project conform to some existing conventions. One example: the console program used to have no project dependencies on its plugins – they were copied as a post-build step into a separate plugins directory in the build output. This was done out of a bit of a purist mindset (and was what I had done on some other projects in the past), but when I migrated to FAKE, it presented some problems, as it was difficult to duplicate this exact behavior. The solution was to simply abandon purity, adjust the way I did things, and add project dependencies from the console application to the plugins. Realistically speaking, anyone developing a plugin is probably going to have the full source handy anyway, so why get hung up on this?

Other libraries

So far, I’ve pulled in just a few other libraries. One is sqlite-net, a SQLite wrapper, and another is CommandLineParser, which lets me construct verb-driven command line handling in the console application. I spent a little while wrestling with both, but now I have a couple of wrappers and things generally set up in a way that works well. (I actually switched back and forth between the old version of CommandLineParser and the new beta, and wound up sticking with the beta, as it fixed at least one annoying crash relating to help text rendering when using verbs.) I also wound up adding the venerable SharpZipLib library for zip archive support.
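
Verb-driven handling with the CommandLineParser 2.x beta looks roughly like the sketch below – the verbs and options here are invented for illustration, and are not Ludum Linguarum’s actual command set:

    open CommandLine

    [<Verb("import", HelpText = "Extract strings from a game.")>]
    type ImportOptions() =
        [<Option('g', "game", Required = true, HelpText = "Name of the game.")>]
        member val Game = "" with get, set

    [<Verb("export-anki", HelpText = "Export extracted strings as Anki cards.")>]
    type ExportOptions() =
        [<Option('o', "output", Required = true, HelpText = "Output file path.")>]
        member val Output = "" with get, set

    let runCommandLine (argv: string[]) =
        Parser.Default.ParseArguments<ImportOptions, ExportOptions>(argv)
            .MapResult(
                (fun (opts: ImportOptions) -> 0),   // run the import here
                (fun (opts: ExportOptions) -> 0),   // run the export here
                (fun _errors -> 1))                 // parse failure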

Summary

In summary, I’m glad that I have a setup now, using FAKE and Paket via the F# project scaffold, which is good for rapid development in Visual Studio, has good testing support, and one-line release packaging and deployment. There were a few bumps along the way in arriving at this setup, but I can wholeheartedly recommend it to anyone working in this ecosystem.

Introducing Ludum Linguarum

I’ve been working on a side project for some time now, and it’s gotten far enough along that it’s worth releasing it, and discussing it. It’s called Ludum Linguarum (LL) – a little awkward, yeah, but I figured that a unique name would be better in this case than spending a lot of time trying to find an available-yet-expressive one.

What does it do?

Well, it’s intended to be a tool for extracting localized resources from games, and then converting them into language learning resources. In other words, the end goal is that you can turn your Steam library (full of games that you bought for $0.99 in some random bundle) into material to help you learn another language.

The current version pulls strings from games, and turns them into flash cards for use with Anki (and compatible apps). LL supports 21 games right now, and the goal is to expand that over time.

Why write something like this?

Well, it involves two things that have always interested me (games and languages), and as far as I know, nothing else like this exists! (subs2srs is a tool in a similar vein, but it generates flash cards from subtitled videos instead.) I figure you might be able to get a little extra motivation and drive by learning another language in the context of gaming.

Another reason is that the vocabulary of games is often well off the beaten path of most language courses – I don’t think that Rosetta Stone or even Duolingo is going to tell you that “magic missile” is Zauberfaust in German. There aren’t that many opportunities to learn this stuff otherwise – think of it like professional vocabulary, but for a really weird job.

I also find cultural differences interesting, and that includes the way that game content gets translated. Seeing how colloquialisms and “realistic” conversation get translated is really interesting to me – I get a huge kick out of learning that platt wie Flundern is how someone chose to translate “flat as a pancake.”

Finally, game content in itself is an interesting treasure trove where you can often see the remnants of things that were tried and abandoned, or cut in order to get the product to the finish line. And naturally, some of the most common types of remnants are text and audio.

Next Posts

I’m going to spend the next few posts talking about the development of Ludum Linguarum, and writing the code to extract strings out of the first few games it supports. There were quite a few interesting problems that came up while getting to this point, and a few interesting tidbits and trivia that I can share about some of the supported games.

Open Live Writer

This is just a test post to try out Open Live Writer on my blog. I used to use the old Live Writer a bit, and was glad to hear that it had recently been open sourced.

So why am I all of a sudden interested in blogging again? Well, I have a few articles that I’d like to write, relating to a little side project that I’ve been working on, and I really like the WYSIWYG and native-client feel of Live Writer versus the WordPress admin UI.

Stay tuned!

Compacting a VHD

I was looking to back up some VHD containers that I use to store files in Windows, and needed to trim one of them down before it would fit under the OneDrive 10 GB upload limit. Since it was a dynamically expanding VHD, just removing files from the container wasn’t sufficient to reduce the actual size of the VHD file. Once I had removed the files, I needed to unmount the drive, and then compact it using the diskpart utility. Here are the steps I followed:

  1. Run the diskpart command from a command prompt.
  2. Enter select vdisk file="path to VHD file".
  3. Enter attach vdisk readonly.
  4. Enter compact vdisk. This will compact the VHD file, and might take a little while.
  5. Finally, enter detach vdisk and then exit. This will detach the VHD file and exit diskpart.

Once this is done, your VHD size should be reduced to the minimum necessary to store the files within!
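
If you find yourself doing this regularly, diskpart can also run the same commands unattended from a script file, via its /s switch. Here’s a sketch (the file path is a placeholder for your own VHD):

    rem compact-vhd.txt -- run with: diskpart /s compact-vhd.txt
    select vdisk file="C:\path\to\container.vhd"
    attach vdisk readonly
    compact vdisk
    detach vdisk
    exit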

Analysis of Yakuza 5 Hack Videos

I happen to be pretty hyped over the upcoming US release of Yakuza 5 — I’m a big fan of the series’ odd mix of ridiculous melodrama, wide variety of activities and minigames, and really satisfying combat. So naturally, after the localization was announced, I went around looking for videos of the game to watch. First off, I found this amazingly comprehensive and lovingly-assembled survey of the whole series — it’s not really related to the rest of this post, but if you’ve never seen these games it’s worth watching to get a glimpse of how unique they are, and what’s so appealing about them to me.

Then, I found some videos of some hacks that someone apparently made to the game, to allow the player to play as Haruka and another female character (Mai — no idea what her place in the story is). During the normal story arc, there is a chapter where you play as Haruka — however, her fights are rhythm games and dance battles, not the sort of bare-knuckle brawls for which the series is famous. This hack instead allows you to play as these characters during other chapters of the game, where you engage in tons and tons of fistfights. And, somewhat surprisingly, if you watch the videos, it looks pretty good!

So, putting on my ex-game developer hat, what do these videos tell us about the way the game is built, and why this was possible? And is adding a new playable character to the game as simple as these videos make it seem? Here are some of my observations and speculation on how this works, and its limitations.

  • The female and male characters must be animated using the same skeleton. Basically, because all of the combat animations that these characters are using are the same ones that the standard playable characters use, Haruka and Mai must be built and animated on the same basic skeletons as Kiryu, Saejima, Akiyama, and Shinada. This is a little surprising to me, but it goes a long way to explaining why the female characters in this series always seem to…uh, have a somewhat mannish feel to them. I’m guessing that, for the original PS2 games, this was done to save memory, and that it was then brought forward because it worked well enough, and making unique skeletons would have required duplicating an already-large animation set.
  • Haruka and Mai are missing a lot of animation metadata. The most obvious case is when Haruka goes to light up a relaxing cigarette after beating the tar out of countless schlubs, just like uncle Kiryu.
  • The smoke from the cigarette comes out of Haruka’s chest — or, more accurately, the origin (0, 0, 0) of the character. There’s a missing animation attachment point in Haruka’s metadata, and the game engine falls back to the origin (see the sketch after this list for the general idea). Interestingly, Mai seems to have this attachment point — the smoke appears in the correct place for her.

    Another example of missing metadata is this HEAT attack with a bowling ball — it pops away from Haruka’s hand and looks like it’s stuck on her nose. And if you freeze-frame a similar HEAT attack with a beer bottle, you can see that the bottle looks like it’s stuck on her lips.

    I believe there are also missing camera focus points — for example, at the end of this HEAT move, the camera seems to be focused on the origin point of Haruka, and her face is off camera. If I remember correctly, this move looks different when performed by one of the other player characters — the camera tracks the head and it’s in the frame.

  • They’re also missing a lot of animations. The easiest case to spot is that Haruka and Mai’s faces remain completely expressionless, and possibly unblinking, during fights — they don’t have any combat “barks” (voice + facial animation), and they don’t play any reaction or pain facial animations as they lay waste to their foes. While it kind of lends a comic tone to the video, this would definitely not be acceptable for an officially supported character. It just looks strange.
  • The game’s IK seems to work OK with these characters. I had kind of assumed that the engine supported IK, given that a lot of the close combat grabs in the game look pretty good. When Haruka grabs a thug by the hand, that’s a pretty strong signal to me that they’re doing some limited IK, because if they weren’t, you would probably see a gap in the throw animation as Haruka’s character is physically smaller than, say, Kiryu’s. Another example of this is Mai kicking the sign stuck on a thug’s head — it’s just too unlikely that it would look good without IK support.

    Note that there are still some cases where it looks like they don’t normally use IK, and just rely on the animations fitting the sizes of the characters — Haruka lifting up a thug looks pretty bad, as her hand is nowhere near the thug’s chest.

  • Haruka and Mai’s hair is not built to be animated during combat. In the case of Mai, her hair basically doesn’t move at all. And Haruka’s hair physics object was clearly tuned to look good during movement animations, but not at all for anything resembling combat, with its frequent flips, tumbles, falls, and dashes. It’s all over the place constantly.
  • Both characters seem to be using Akiyama’s move set. But I can’t tell if this is just a convenience, a deliberate stylistic choice on the part of the author, or that none of the others would work. I think it would be kind of funny to see them using Saejima’s brawling moves, though.
  • Surprisingly, there was no content protection on the game assets. Presumably the author of these videos was able to simply pull out the PS3 HDD, and modify the files directly on the hard drive to point a character definition to Haruka or Mai’s models. It’s a little surprising to me that these were left unprotected, but perhaps Japan has less societal anxiety about hot coffee than the US. Maybe I should have a look at the installed data, to see if I can verify any of my conjecture here.
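
To illustrate the attachment point fallback mentioned in the list above: my guess is that the engine does something like the following sketch (pure speculation on my part, expressed in F#-flavored pseudocode) – look up the named attachment point in the character’s metadata, and quietly fall back to the character’s origin when it’s missing.

    // Speculative sketch: an effect asks for an attachment point by name, and
    // a character missing that metadata silently yields its origin (0, 0, 0).
    type Vec3 = { X: float32; Y: float32; Z: float32 }
    let origin = { X = 0.0f; Y = 0.0f; Z = 0.0f }

    let attachmentPoint (metadata: Map<string, Vec3>) (name: string) =
        match metadata.TryFind(name) with
        | Some point -> point   // e.g. a mouth attachment for the cigarette
        | None -> origin        // missing metadata: smoke comes out of the chest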

In closing, I think these are really neat, fun videos to watch, and it would be very cool if female characters in future Yakuza installments were able to fight and brawl. But there are enough rough edges in this hack that it should be clear that making these characters fully playable is not just a matter of flipping a switch (or deciding to change the story) and suddenly having Haruka powerbombing fools alongside her uncle Kiryu. There’s a lot of missing content and additional polish that would need to go into making Haruka and Mai first-class fighting characters in the game.

Questions or comments on my analysis are welcome!

Operation Stop Junk Mail

I am sick and tired of receiving junk mail. It wastes my time, it wastes resources, and it generally has no redeeming value whatsoever. Even worse, I feel that for some classes of junk mail (stupid stuff like balance transfer checks, which I will never ever use), I need to take special care to shred it to avoid identity theft or scammery. So, I’m going to try to do everything I can to stop junk mail from being sent to me, and document everything that I’ve done in the hopes that it gives other people some ideas on how to stem the tide of garbage hitting their mailbox.

The first step on this journey is the FTC’s “Stopping Unsolicited Mail, Phone Calls, and Email” page. Here you will find:

  • optoutprescreen.com, which is a site created by four major U.S. credit reporting companies to allow you to opt out of pre-screened credit card offers. You can opt out for a period of five years electronically — to opt out permanently, you need to mail in a signed form (which is a ridiculously weaselly requirement that is just trying to raise the pain threshold for truly opting out). Considering how much junk mail I get that consists of credit card offers, this seems like a great place to start. Note that unlike USPS mail forwarding, every individual in the household will need to opt out.
  • The government’s “do not call” registry, www.donotcall.gov. While this doesn’t actually address junk mail, it’s such a basic quality-of-life improvement that it’s worth including anyway.
  • The Direct Marketing Association’s “Mail Preference Service” site, www.dmachoice.org. This lets you opt out of several categories of junk mail. They also have an “e-mail preference service” which alleges to reduce unsolicited commercial e-mail.

Beyond that, you’ll need to start on the other companies with whom you probably do business, and who sell your name and address to “marketing partners.” The primary ones that I’m focusing on are banks, credit card companies, and airlines, since those seem to account for most of the garbage offers I get in the mail. The general rule of thumb is that the “opt-out” switches tend to be hidden in the “Privacy Policy” section of each company’s site — if you can’t find any way to opt out of slimy, sleazy marketing in the normal account settings, check the privacy policy. (I’m guessing that there’s a legal reason for this, but I haven’t dug into the specifics.)

I’m going to start with these and see how it goes. Hopefully this will eradicate a significant amount of hassle and wasted time and resources!