Friday, August 30, 2019
Add a Separator in macOS Dock
We can use a spacer to group related apps together in the macOS Dock.
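One way to add such a spacer is from Terminal: the first command below appends a blank spacer tile to the apps section of the Dock (use "small-spacer-tile" instead for a half-size gap), and the second restarts the Dock so the tile shows up. The tile can then be dragged between groups of apps or removed like any other Dock item.
defaults write com.apple.dock persistent-apps -array-add '{"tile-type"="spacer-tile";}'
killall Dock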
Remove Code Coverage and Warnings for CocoaPods
When building and running tests with code coverage enabled for an iOS project that uses CocoaPods, the code in the included pods is also taken into account by default. In order to disable code coverage and warnings for the included pods, add the block below to the Podfile.
# Disable code coverage for pod files
post_install do |installer|
  installer.pods_project.targets.each do |target|
    target.build_configurations.each do |config|
      config.build_settings['CLANG_ENABLE_CODE_COVERAGE'] = 'NO'
      config.build_settings['SWIFT_EXEC'] = '$(SRCROOT)/../SWIFT_EXEC-no-coverage'
      config.build_settings['PROVISIONING_PROFILE_SPECIFIER'] = ''
      config.build_settings['CODE_SIGNING_ALLOWED'] = 'NO'
      config.build_settings['CODE_SIGNING_REQUIRED'] = 'NO'
    end
  end
  installer.pods_project.build_configurations.each do |config|
    config.build_settings['PROVISIONING_PROFILE_SPECIFIER'] = ''
    config.build_settings['CODE_SIGNING_ALLOWED'] = 'NO'
    config.build_settings['CODE_SIGNING_REQUIRED'] = 'NO'
  end
end
Create a file named SWIFT_EXEC-no-coverage in the root folder, at the same level as the Podfile, with the snippet below, and make the file executable by running chmod +x SWIFT_EXEC-no-coverage.
#!/usr/bin/perl -w
# Wrapper around swiftc used by the pods' SWIFT_EXEC build setting: it consumes
# the -profile-coverage-mapping flag and forwards all remaining arguments to
# the real Swift compiler.
use strict;
use Getopt::Long qw(:config pass_through);

my $profile_coverage_mapping;
GetOptions("profile-coverage-mapping" => \$profile_coverage_mapping);

exec("/Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/swiftc", @ARGV);
Thursday, August 29, 2019
Hello, Dreamboat. Shall we chat?
The story goes like, I got a gig to work on a bot that should work with Artstation.com website. The requirement was that there must be some automated way to send messages to users of ArtStation. So I used AppleScript to interface with Safari and JavaScript to bridge between AppleScript and the web page. The bot or rather the script will get a list of user's profile link from a local file and then load them in Safari. For the initial launch, if the sender is not signed-in, the bot will sign in with the sender credentials loaded from a creds file. Then the bot will invoke a couple of JavaScript bridging methods to simulate a click on the buttons, add text in the message field, trigger change event so that AngularJS recomputes the constraints and enables the submit button. Then it invokes submit button tap which will send the message to the user, and finally closes the tab and if all tasks were done, closes the browser.
Everything is cool, right? Not quite, because now the requirements change: the bot needs some intelligent behaviour, and there won't be any list provided. The bot has to figure out which message should be sent to which category of user, and it has to get all the users the website has. So basically the bot is a crawler plus an intelligent messenger. Okay, so let's add some intelligent behaviour and spawn a couple of crawlers. When it comes to AI, true randomness marks the height of intelligence ;) So with that in mind, I went with a full-fledged re-architecture.
The app is a Cocoa app which now uses a web view to do the messaging part. This gives more fine-grained control over page loading events and such. The JavaScript communicates with the native code using a WebKit message handler. There isn't much UI in the app, as the main focus was on the functionality. I added some screens to view the details of the crawler and messenger, and to configure the messages and sender credentials, which get persisted to the database.
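Just to give an idea of that bridge, here is a minimal sketch; the handler name and class are made up for illustration, not the app's actual code.
import Cocoa
import WebKit

// Sketch of a WebKit message handler bridge. The page's JavaScript posts events
// to the native side via window.webkit.messageHandlers.<name>.postMessage(...).
final class MessagingController: NSObject, WKScriptMessageHandler {
    lazy var webView: WKWebView = {
        let config = WKWebViewConfiguration()
        // JavaScript injected into the page can now call
        // window.webkit.messageHandlers.bot.postMessage({...})
        config.userContentController.add(self, name: "bot")
        return WKWebView(frame: .zero, configuration: config)
    }()

    // Called on the native side whenever the page posts a message.
    func userContentController(_ userContentController: WKUserContentController,
                               didReceive message: WKScriptMessage) {
        print("JS event:", message.name, message.body)
    }
}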
Unfortunately, there is no developer API provided by the website. Life would have been much easier otherwise. I did some debugging of the JavaScript the site loads and figured out how their API service works. Just for kicks, I wrote the crawler and frontier services in Objective-C. The crawler first gets the anti-CSRF token so that the request can get through the CloudFront security validation. Without the CSRF token, we get a captcha, which is hard to solve for my bot ;) Then the bot calls the user v2 API with the same params the website uses to get the list of users, which returns data in JSON format. The JSON data reflects their model layer, presumably that of a NoSQL DB.
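Roughly, and purely as a hypothetical sketch since the real endpoints and params are whatever the site's own JavaScript uses, the token-then-request step could look like this; the meta-tag format, the X-CSRF-Token header name, and the helper names are assumptions.
import Foundation

// Hypothetical sketch: extract an anti-CSRF token from a served HTML page and
// attach it to later API requests. Tag format and header name are assumptions.
func csrfToken(fromHTML html: String) -> String? {
    // Many Rails-style sites embed the token as <meta name="csrf-token" content="...">
    guard let tagRange = html.range(of: #"<meta name="csrf-token" content="[^"]+""#,
                                    options: .regularExpression) else { return nil }
    return html[tagRange].split(separator: "\"").last.map(String.init)
}

func apiRequest(url: URL, csrfToken: String) -> URLRequest {
    var request = URLRequest(url: url)
    request.setValue(csrfToken, forHTTPHeaderField: "X-CSRF-Token")  // assumed header name
    request.setValue("application/json", forHTTPHeaderField: "Accept")
    return request
}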
Now that we have the users list by category, we need to persist the data. And there are probably close to a million users, so I need a DB that scales well. So I went with FoundationDB with the Document Layer. As a matter of fact, FoundationDB powers iCloud, and its distributed, fault-tolerant, scalable architecture is very promising. All of the DB setup went really well. Next I needed a MongoDB driver to talk to the Document Layer. So I used the official MongoSwift library, but then Swift Package Manager refused to work because it sensed the presence of Objective-C code. After wrestling with SPM, Xcode and MongoSwift, I just wrote the FoundationDBSevice, which is the persistence layer, in Objective-C again so that I can call directly into the Mongo C driver to work with the DB. Less pain. If I had used only Swift, working with SPM would have been a breeze and I could have used MongoSwift readily. Nevertheless, the bot now crawls the website, saving users by category to the local FDB.
To not DDoS their website, I used the GKGaussianDistribution that comes with GameplayKit to generate a random number from a specified mean and standard deviation, and use that value along with the current time to schedule the crawl. The same logic is used for the messenger as well, but with a different set of mean and SD values. The bot saves the state of each crawl so that the next time it starts, it can crawl (fetch) users from where it left off. Users are messaged only once per category. If a user belongs to multiple categories, she will get a different set of messages relevant to each category. The sender details can be added from the settings, which are persisted in the DB except for the password, which is stored in the macOS Keychain. We can set a message template for each category and the bot will interpolate the string before sending the message to each user. Now the bot says Hi to ArtStation users.
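A minimal sketch of that scheduling idea, assuming made-up mean and deviation values rather than the bot's actual numbers:
import Foundation
import GameplayKit

// Sketch: draw a delay from a Gaussian distribution and schedule the next
// crawl relative to the current time. Mean/deviation values are placeholders.
func scheduleNextCrawl(meanSeconds: Float = 3600, deviationSeconds: Float = 600,
                       crawl: @escaping () -> Void) {
    let distribution = GKGaussianDistribution(randomSource: GKRandomSource.sharedRandom(),
                                              mean: meanSeconds,
                                              deviation: deviationSeconds)
    let delay = TimeInterval(max(1, distribution.nextInt()))  // seconds from now
    let timer = Timer(fire: Date().addingTimeInterval(delay), interval: 0,
                      repeats: false) { _ in crawl() }
    RunLoop.main.add(timer, forMode: .common)
}
The messenger side would reuse the same idea with its own mean/deviation pair.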
Check out the source code on GitHub and let me know what you think.