A geeky personal assistant

Most of our interactions with a computer start with a text input box, whether it's the Start menu, Google, or the browser's URL bar.

The new personal assistants devised by the big software companies are robots created to obey our commands and do what we tell them.

hmm…

Jurassic Park Computer

Wasn’t the original command line exactly an attempt to do that? In the 1970s and 1980s, processing power and accumulated programming knowledge weren’t enough to make this interface easily usable, so the focus shifted to other things, like buttons and menus. Programs had few features back then, so it was easy to arrange a set of icons on the screen, one for each function.

Once the features started to multiply, deeper and deeper menus and dialogs proliferated, and finding a function or option by digging through endless windows was obviously harder than searching through a text file. But searching through text files, or using the command line at all, had already been deemed old-fashioned and was forgotten.

Even today, when somebody sees me typing into the Linux terminal I get all kinds of reactions of awe and disgust, as if I am doing something magical. Most things I do through the terminal, however, are genuinely easier to do that way.

 $ convert image.png image.jpg

for example, is much easier than opening the image in GIMP, waiting for it to load fonts and extensions, and then hitting File > Export and then OK twice.

So, what is the problem?

Discoverability. It’s hard to know which command to use and what its syntax is. This is exacerbated by the multi-developer nature of GNU/Linux: each command has its own syntax and there is no consistency whatsoever.

tar bomb

Can it be fixed?

In my opinion, easily: by using standard tools like aliases to give commands more obvious names, and by taking ideas from modern IDEs to add visible autocompletion and help messages. This needs a couple of brainstorming sessions, but here’s a first idea.

terminal autocomplete preview
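As a rough illustration of the aliases part, a few lines of plain Bash are already enough to give existing tools more obvious names; the names below are made-up examples, not an actual proposal:

 # hypothetical friendlier names on top of existing tools
 alias image-convert='convert'    # ImageMagick's convert under a clearer name
 alias disk-usage='df -h'         # free space in human-readable units
 alias search-text='grep -rn'     # recursive text search with line numbers

With these in place, image-convert image.png image.jpg reads a lot closer to what the user actually wants to do.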

How does this affect Nemo?

Nemo’s target audience, at least for now, is geeks. Geeks like telling their computers what to do and are not afraid of using the terminal. So my proposal is to modify the home-screen search box to also accept terminal commands.

homescreen search view

Commands will get a faded autocomplete with inline help, and the various autocompleted options will turn into drop-down menus when clicked. All of this functionality will be driven by plain old Bash autocompletion.

homescreen search autocomplete drop down
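The drop-downs don’t need anything fancier than the programmable-completion machinery Bash already ships with. A minimal sketch, where the call command and the contact names are assumptions of mine, not real Nemo commands:

 # register a completion function for a hypothetical 'call' command
 _call_completions() {
     local cur=${COMP_WORDS[COMP_CWORD]}
     COMPREPLY=( $(compgen -W "Eva Alice Bob" -- "$cur") )   # made-up contact list
 }
 complete -F _call_completions call

The search box can ask for the same word list through compgen and render it as a menu instead of a terminal listing.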

Bash autocompletion can also be used to enlarge the keyboard’s sensitive area for the letters that are most likely to come next.
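The completion list for the current prefix already tells us which letters can follow it; a throwaway sketch for commands starting with git (any prefix would do):

 $ compgen -c git | cut -c4 | sort | uniq -c | sort -rn

The counts of the character right after the prefix give the keyboard a rough idea of which keys to enlarge.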

More traditional commands will be executed and their output printed in the results space in a monospaced font, along with some relevant options.

homescreen search terminal view

There will be a whitelist of commands that execute instantaneously, and the output of those commands will be displayed as you type in the space below the search box.
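A minimal sketch of that check, assuming the search box hands the shell whatever has been typed so far; the whitelist contents are just an example:

 # fast, non-destructive commands that are safe to run on every keystroke
 auto_execute="ls date cal df"
 maybe_run() {
     local query="$1" cmd="${1%% *}"
     case " $auto_execute " in
         *" $cmd "*) eval "$query" ;;   # whitelisted: show its output as you type
         *)          return 1      ;;   # everything else waits for an explicit Enter
     esac
 }

Anything not on the list simply falls through and is only executed when the user deliberately presses return.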

What is needed?

Not much. Apart from some GUI work, we just need to create sane aliases and commands for most phone functionality. These commands should be a bit lax about syntax; for example, the call command should accept call <contact/number> [[on|at|] <numbertype>], so that call Eva at home, call Eva on mobile and the more classic, command-like call Eva mobile all produce valid output.

There is no need to accept please call Eva on mobile, as the goal is not to create a natural-language parser but rather something similar enough that commands are easy to remember.
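A minimal sketch of such a lax call command; the contact lookup and the dialing itself are placeholders, not real Nemo interfaces:

 # call <contact/number> [[on|at|] <numbertype>]
 call() {
     local contact="$1"; shift
     case "$1" in on|at) shift ;; esac            # swallow the optional filler word
     local numbertype="${1:-default}"
     echo "dialing $contact ($numbertype number)" # stand-in for the real dialer
 }

With this, call Eva at home, call Eva on mobile and call Eva mobile are all accepted, the last two resolving to the same thing.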

4 thoughts on “A geeky personal assistant”

  1. Wow, this is so great! I’m really looking forward to using Glacier UI in the future!

  2. This is a nice thought! But I’m kind of confused as to how this would work exactly. So it’s like a personal assistant based on text input, unlike the common voice-input method. But how does it work: is the search carried out as you type, and the command when you hit ‘enter’? Or does it totally replace the search function? Does the personal assistant also wait for enter to carry out commands? Do the available commands also come up as suggestions when typing, like an autocomplete? I really don’t understand how everything ends up working.

    • This is only an initial idea sketch, so yes, it’s possible that something doesn’t really fit and needs tweaking, but the general workflow is the following:

      Start writing in search box
      If the text is not a command, display search results. Pressing return does nothing.
      If the text is a command, and the command is in our list of auto-execute commands, show its output. Just below the output, show search results. In this case pressing return might open the result in fingerterm or something, but it still does nothing meaningful.

      If the text is a command, but the command is not in our whitelist, show an item with what the command does, plus some other relevant commands (if the search box contents are ‘call Eva’ we can show ‘sms Eva’ too), and show the search results below as usual. Pressing return should execute the command.

      The auto-execute list will contain some commands that are non-destructive and that produce results quickly enough that running them continuously after each keystroke won’t be a problem. I can think of a few commands that would make sense there, like ls, contacts --list, calllog etc.

  3. I like the idea very much. I had set up a similar system on my jailbroken iPhone a few months ago that worked with x-callback URLs if an app had support for them. It could be very convenient, but it’s quite hard to lose the habit of tapping on the screen and instead have the reflex of opening the search box and typing the query. Although once you get used to it, it really speeds things up.

    I can see it integrated with the search app of Nemo, since imho it is quite useless on its own (at least when I [didn’t] use it on the N9).

    I’d like a terminal like the one in your screenshot, with auto-complete suggestions ;-)