Summary
Server administration is an activity that often happens in an isolated context in a terminal. ChatOps is a way of bringing that work into a shared environment and unlocking more collaboration. This week Jacob Tomlinson talks about the work he has done on opsdroid, a new bot framework targeted at tying together the various services and environments that modern production systems rely on.
Preface
- Hello and welcome to Podcast.__init__, the podcast about Python and the people who make it great.
- I would like to thank everyone who supports us on Patreon. Your contributions help to make the show sustainable.
- When you’re ready to launch your next project you’ll need somewhere to deploy it. Check out Linode at www.podcastinit.com/linode?utm_source=rss&utm_medium=rss and get a $20 credit to try out their fast and reliable Linux virtual servers for running your awesome app.
- Need to learn more about how to scale your apps or learn new techniques for building them? Pluralsight has the training and mentoring you need to level up your skills. Go to www.pythonpodcast.com/pluralsight?utm_source=rss&utm_medium=rss to start your free trial today.
- Visit the site to subscribe to the show, sign up for the newsletter, read the show notes, and get in touch.
- To help other people find the show please leave a review on iTunes, or Google Play Music, tell your friends and co-workers, and share it on social media.
- If you work with data for your job or want to learn more about how open source is powering the latest innovations in data science then make your way to the Open Data Science Conference, happening in London in October and San Francisco in November. Follow the links in the show notes to register and help support the show in the process.
- Your host as usual is Tobias Macey and today I’m interviewing Jacob Tomlinson about opsdroid
Interview
- Introductions
- How did you get introduced to Python?
- What is opsdroid and what was the problem that you were trying to solve when you started the project?
- What led you to choose Python as the language for implementing opsdroid?
- What did you find lacking in the multitude of other chat bots that necessitated starting a new project? (e.g. Hubot, Errbot, Lita)
- One of the main features that you list in the documentation is the ease of installation. Why is that such an important aspect of the project and how is that implemented?
- What has been the most interesting and the most challenging aspect of implementing opsdroid?
- On the opsdroid organisation on GitHub there are many repositories for plugin modules. Do you see this being a management issue in the long term?
- How is opsdroid architected and what were the system requirements that led to the current system design?
- How do you manage authorization and authentication for performing commands against your production infrastructure in a group chat environment?
- What are some of the other security implications that users should be aware of when deploying a bot for interfacing with their deployment environment?
- How does a chat-oriented bot framework differ from those that are being created for voice-oriented interaction?
- What do you have planned for the future of opsdroid?
Keep In Touch
- Website
- @_JacobTomlinson on Twitter
- jacobtomlinson on GitHub
Picks
- Tobias
- Jacob
Links
- Iron Man Movie
- Puppet
- Hubot
- ChatOps
- asyncio
- Home Assistant
- api.ai
- Luis
- Lex
- Slack
- Mycroft
- Kalliope
- Amazon Alexa
- opsdroid audio
- Snowboy
- Google Home
- Wit.ai
The intro and outro music is from Requiem for a Fish by The Freak Fandango Orchestra / CC BY-SA
Hello, and welcome to Podcast.__init__, the podcast about Python and the people who make it great. I would like to thank everyone who supports us on Patreon. Your contributions help to make the show sustainable. When you're ready to launch your next project, you'll need somewhere to deploy it. So you should check out Linode at www.podcastinit.com/linode and get a $20 credit to try out their fast and reliable Linux virtual servers for running your next app. Do you need to learn more about how to scale your apps or learn new techniques for building them? Pluralsight has the training and mentoring you need to level up your skills. Go to www.podcastinit.com/pluralsight to start your free trial today.
You can visit the site to subscribe to the show, sign up for the newsletter, read the show notes, get in touch, and support the show. To help other people find the show, please leave a review on iTunes or Google Play Music, tell your friends and coworkers, and share it on social media. If you work with data for your job or want to learn more about how open source is powering the latest innovations in data science, then make your way to the Open Data Science Conference happening in London in October and San Francisco in November. Follow the links in the show notes to register. Your host as usual is Tobias Macey, and today I'm interviewing Jacob Tomlinson about opsdroid. So Jacob, could you please introduce yourself?
[00:01:23] Unknown:
Hi. Yes. So, my name is Jacob Tomlinson. My background is in web development and web design, and then, more recently, Linux system administration and cloud architecture.
[00:01:36] Unknown:
And how did you find yourself making the transition into systems administration from doing more, web oriented work?
[00:01:44] Unknown:
So I spent only about a year doing web development full time. And, you know, I was writing PHP applications with MySQL databases and things. And I just kept finding myself enjoying building the servers and getting all the software installed more than actually building the websites. And so, after a little while, I decided that I'd prefer to transition into doing just the server stuff. So I started looking for positions in that area.
[00:02:15] Unknown:
Yeah. And with the recent DevOps movement, the actual administration of the systems is becoming more of a software engineering discipline in and of itself anyway. So there are definitely a lot of interesting challenges to tackle in that regard as well.
[00:02:29] Unknown:
Yeah. Definitely. I mean, a lot of the people that I was working with on the system administration side had come from either Microsoft system administration, or straight into system administration from a more pure computer science point of view. So me coming from a web development background was quite useful, because I understood the developer side a bit more than the operations side. And that was really useful for communicating between the different areas. And you said that when you were doing the web work, you were primarily working in PHP. So do you remember how you first got introduced to Python? Yeah. Python's a funny one for me. I feel like I've learned Python probably about three times now, four times maybe. The first time was at school. It was the standard thing that people were suggesting to learn.
It was before Python was added to the actual curriculum here in England, but it was one of those things I learned a little bit as I was getting into computers in general, and then I didn't really do much with it for a couple of years. And then when I was further on in education, we had one module in one of our courses which was Python. So that was nice, to revisit and have a look at Python again. Although, frustratingly, from then on they switched us onto Visual Basic, which wasn't necessarily ideal for what I was looking at doing.
So when we moved to Visual Basic, I started looking at PHP because I was more interested in web back end stuff, and that seemed to be the language of choice then. And it wasn't until I got more into the system administration stuff and saw that Python was really useful for sysadmin work that I sat down and thought, well, I'm going to properly learn Python now and really get to grips with it. So I went through a big phase of relearning. And now these days, it seems to be the language I use more than anything.
[00:04:18] Unknown:
And are you using it primarily for writing one-off scripts or general tooling, or are you also using it in the context of one of the configuration management frameworks?
[00:04:29] Unknown:
So it's mainly in one-off scripts and tooling. Weirdly, a lot of the configuration management I've done is in Puppet, so that's, you know, the Ruby world. But my Ruby is nowhere near as polished as my Python, and I tend to find myself writing a lot of Python for administering things on Amazon Web Services and using different APIs, you know, controlling Kubernetes clusters with their API and those kinds of tasks. I also find myself, whenever I need to write a little RESTful API to do this, that, or the other, just writing that in Python with something like Flask or Django.
[00:05:06] Unknown:
So I'm wondering if you can briefly describe what opsdroid is and the problem that you were trying to solve when you first started the project.
[00:05:13] Unknown:
Yeah. So opsdroid has kind of grown out of partly just an obsession with chat bots. I guess I got really interested just after watching the first Iron Man film when that came out and, you know, Jarvis. And I think if you talk to a lot of people that are quite into chatbots, that seems to be one of the main examples that they cite. And I was always quite interested in writing something along those lines. And within the team that I work in now, we wanted some way of automating different workloads and things. And I was spending a lot of time reading about the way that GitHub works internally, and saw that they were using what they called ChatOps.
And they were using Hubot, you know, the framework that they've written in CoffeeScript, for automating a lot of their workflows. We used that for a little while, but I found myself getting a bit frustrated with it. And so I decided that I'd have a go at writing my own framework to try and solve that same kind of problem. I primarily chose to do it in Python because there was a specific library that I wanted to learn, which is the new asyncio stuff that came out in Python 3.4 and was kind of polished in Python 3.5. And so I saw it as a really good opportunity to really get stuck in with that library, because it was quite useful in the context of what opsdroid does. So the idea behind the module is it's meant to just be like glue between other services, that can link up natural language with actually running some Python code to do a thing. The idea of opsdroid is you run it on a server somewhere, and you install some modules for your chat clients, whether that's Slack or Facebook Messenger or Twitter or however you want to interact with it. You have some modules for storing stuff in databases. So if you need the bot to be able to remember anything, or for persisting the state of a conversation, then that will get stored in different databases. And then there are modules, which I've called skills in the framework, which are just different Python functions that can do a specific thing.
So whether that's starting servers or stopping servers or checking the status of a pull request from GitHub or, you know, just injecting memes and other funny things into the conversation. opsdroid is kind of the process that sits in the middle and brokers all the different services that it needs in order to make that happen.
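The broker-plus-skills idea described above can be sketched in a few lines of plain Python. This is an illustrative toy dispatcher, not opsdroid's actual API; the class and function names here are made up:

```python
import asyncio
import re

class ToyBot:
    """Minimal sketch of the broker idea: regexes map to async skill functions."""
    def __init__(self):
        self.skills = []  # list of (compiled regex, coroutine function)

    def match_regex(self, pattern):
        """Decorator that registers a coroutine as the skill for a pattern."""
        def decorator(func):
            self.skills.append((re.compile(pattern, re.IGNORECASE), func))
            return func
        return decorator

    async def handle(self, message):
        """Route an incoming chat message to the first matching skill."""
        for regex, skill in self.skills:
            match = regex.search(message)
            if match:
                return await skill(match)
        return "Sorry, I didn't understand that."

bot = ToyBot()

@bot.match_regex(r"restart server (?P<name>\w+)")
async def restart_server(match):
    # A real skill would call out to a cloud API here.
    return f"Restarting {match.group('name')}..."

print(asyncio.run(bot.handle("please restart server web01")))
# → Restarting web01...
```

The decorator registers the skill, and the broker loops over registered patterns for each incoming message, which is roughly the shape described in the interview.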
[00:07:27] Unknown:
And you mentioned that when you first started working with ChatOps in general, you were trying out Hubot from GitHub, and you mentioned that you were starting to get frustrated with that. So I'm wondering what were some of the frustrations that you were seeing? And I'm also wondering if you looked at any of the other bot frameworks before you decided to implement opsdroid, such as Errbot or Lita?
[00:07:48] Unknown:
Yeah. So some of the frustrations I was having with Hubot were partly around the language. So Hubot's written in CoffeeScript, which, you know, some people seem to like and some people seem to really hate; I'm kind of on the fence personally about it. But one of the things we found, because CoffeeScript is, you know, a derivative of JavaScript and you have to almost compile it into JavaScript in order to get it to run, is that you can write skills for Hubot in either CoffeeScript or JavaScript. And it meant that there's quite a big fragmentation in the documentation around different projects. If you have a look around on GitHub for all sorts of things people have written, you know, some things will be written in CoffeeScript, some things will be written in JavaScript, and the documentation is a bit split; I think GitHub try to push you towards the CoffeeScript side of things.
But I found debugging quite tricky because of that. Some of the other things we were finding: actually installing it was fine, but installing additional libraries required you to either dump files in a directory or push stuff into npm. And again, that's fine, but because I work primarily in Python, I was just finding it a bit frustrating having to switch out to a different language to do all those different things. And then one of the last things that I found frustrating was the fact that you can only hook up Hubot to one chat service at a time. If you want to hook it up to multiple chat services, you have to run multiple instances of Hubot with kind of duplicated configurations and skills and things. And so, yeah, I was primarily using Slack to interact with the bot, but there were a few situations where I wanted to hook it up to some other clients, and, you know, maybe hook it up to Twitter as well to control a Twitter account or something along those lines. And I was interested to see if I could write something that could do all of those in one go. The other thing that I was finding frustrating was the limitations on how you could match the language to a skill. So Hubot is all about regular expressions and matching regular expressions to different skills. And although opsdroid does that, opsdroid's got a whole suite of other ways of matching different things. And getting Hubot to do anything other than regular expressions just felt a bit hacky.
So I started having a look around for an alternative, and I did come across some of the frameworks like Errbot and Lita. It's not a great reason, but I just dismissed Lita because it was in Ruby, and I'd prefer to work with something in a language that I knew quite well already. So I was kind of considering Python or JavaScript frameworks. My PHP days are kind of behind me; I'd rather not dip too much back into that. So I did have a look at Errbot, and there were a couple of others that I considered that I can't remember the names of now. But Errbot felt very much like it was almost a command line within a chat client. So it was very much about, you know, you start a command with an exclamation mark and then a short command word and then some arguments. And although I prefer to use the terminal over GUIs often, I didn't think it would be an easy sell to the rest of my team and my coworkers to say, oh, you now have to use the command line to do this, but you have to use it in a chat client where you don't get any auto completion or anything along those lines. So again, I was interested in this asyncio library, and I thought this is an interesting project, and so I thought I'd have a go at writing my own.
[00:10:57] Unknown:
And what is it about the problem domain that you think lends itself well to using asyncio and asynchrony in general?
[00:11:06] Unknown:
So I find that every kind of interaction that you're going to have with the bot is going to be an event driven thing. You're going to have events coming in from different places. It's either coming in from your chat client, or it's coming in from a third party API service. opsdroid can also do kind of cron jobs and those kinds of things. So these are all treated as events coming in from different places, and you want these things to happen concurrently, kind of running on an event loop. And, you know, I'd been looking at libraries like Tornado and stuff like that. But it was actually the Home Assistant project that had just switched their core from using the standard Python threading libraries to using asyncio, and I just got really interested in that library and thought, you know, their application is also very event driven, so that library sounds like it lends itself really well to this problem.
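A minimal sketch (not opsdroid code) of why an event loop fits this problem: two independent event sources, a chat listener and a cron-style timer, can run concurrently on one loop in a single thread, with neither blocking the other:

```python
import asyncio

async def chat_listener(events):
    """Stand-in for a chat connector: pretend messages arrive over the network."""
    for text in ["deploy app", "status?"]:
        await asyncio.sleep(0.01)  # simulated network wait
        events.append(("chat", text))

async def cron_timer(events):
    """Stand-in for a scheduled skill: fire on a fixed interval."""
    for _ in range(3):
        await asyncio.sleep(0.005)
        events.append(("cron", "tick"))

async def main():
    events = []
    # Both sources share one event loop; their waits are interleaved.
    await asyncio.gather(chat_listener(events), cron_timer(events))
    return events

print(asyncio.run(main()))
```

With threads, each blocking source needs its own thread; with asyncio, every `await` yields control back to the loop so other event sources can make progress.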
[00:11:52] Unknown:
And before we go too much further, I'd like to take a bit of a diversion and have you define how you think of the term ChatOps, and why you think it's a valuable approach to doing development and systems administration.
[00:12:08] Unknown:
Yeah. Sure. So I actually found a definition I quite liked for ChatOps, which I ended up putting at the top of the README on the project. And the definition covers using a chat client to do operational work that, very often within a team of administrators, you do on your own. You don't necessarily share that with people. You know, very often if something goes wrong and you need to fix it, it's a very insular task. You do it on your own. You do it quietly. You might document it; you might not. You might write some scripts to do it, and you might share those scripts with your team, but you also might not. So the idea of ChatOps is moving all of that stuff out of your own personal workspace into somewhere that's like a group location. Doing these kinds of operations within a group chat, within some kind of messaging client, means that everybody else in the team has visibility over what you've done. They can see what's happening. If there's some kind of large incident happening and multiple people are trying to solve a problem, you can work together within the chat client to do that. And it also means you've got, like, a record of what's happened. So if something's happened in the middle of the night and somebody's on call and they've resolved the issue, the rest of the team can come in in the morning, go back through the chat log, and see what's happened, what alerts fired, and what's been done to try and resolve those problems.
[00:13:26] Unknown:
Yeah. I definitely think that the shared context when there is an incident that you're trying to resolve is probably one of the more powerful aspects of ChatOps because, like you said, it does bring more visibility, and you don't have to have that additional synchronization point of people asking, oh, what commands are you running, or what's the state of the system? They can just look at the chat log and automatically understand what's happening, what's going on, and what the current state of resolution happens to be, without having to maintain an additional communication channel that's going to bring you out of the flow of trying to actually solve the problem.
[00:13:58] Unknown:
Yeah. Exactly. I also find it quite a nice place to consolidate a lot of different resources. So, you know, it's all too easy to have cron jobs that do different bits of housekeeping or that kind of thing. And once you've got a setup with lots of servers, if you're not strict with how you set things up, you can end up with all sorts of cron jobs under all sorts of different users running all over the place. Whereas having it in a ChatOps context with a bot like opsdroid, it's all contained in one place, and everybody can see that and see when they're running and exactly what's going on. And the other thing is that all of the scripts that you've got for automating your workflows are, again, all kept together. So whether you're writing actual Python scripts to do these things within opsdroid, or whether you're getting opsdroid to shell out to Ansible or Puppet or some other application for applying different configs, you're still collecting all of these together and keeping it all in one big bundle. So everybody has visibility over the capabilities and also what's being done.
[00:14:55] Unknown:
And you touched briefly there on one of the things that I wanted to discuss as well, which is integrating ChatOps with an existing configuration management solution. So I'm wondering if you can talk about how opsdroid can be used in conjunction with things like Ansible or Puppet or SaltStack.
[00:15:13] Unknown:
Sure. So I've tried to keep the idea behind writing skills for opsdroid really simple. An individual skill is a Python function. You know, you use decorators to match those functions up to the actual language that's being used in the chat client, but it is just a function of pure Python. And so that can do whatever you like. So, you know, if you want to call out to a service like Ansible, you could literally just use the subprocess module to open a new process and get that to run, and then maybe report the status of that command back into the chat client. Or, if something you're using for configuration has an API, like a RESTful API, then you can of course use all the Python tooling for interacting with an API, and make logical decisions about, oh, this has happened, so should I run this or should I run that? But all the time, it can be done in a dialogue with the bot, and you can have a back and forth about what should be done in different situations if there's no sensible default set or anything like that. So it doesn't necessarily give you tons of help with exactly how you should do something, but it gives you the freedom and the flexibility to work with whatever you're happy with.
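The shell-out pattern described here can be sketched like this. This is a hedged example, not from the opsdroid docs; the `ansible-playbook` invocation and playbook name are hypothetical, and the demo at the bottom substitutes a harmless `echo`:

```python
import subprocess

def run_command(cmd):
    """Run an external tool and summarise the result as a chat-friendly string.

    `cmd` is an argument list, e.g. ["ansible-playbook", "deploy.yml"]
    (the playbook name here is made up for illustration).
    """
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode == 0:
        return f"Success: {result.stdout.strip()}"
    return f"Failed (exit {result.returncode}): {result.stderr.strip()}"

# A skill would call this and respond with the summary string.
# Using echo as a stand-in for a real configuration management tool:
print(run_command(["echo", "playbook complete"]))
# → Success: playbook complete
```

Passing the command as a list (rather than a shell string) avoids shell injection from user-supplied chat text, which matters when the arguments come from a group chat.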
[00:16:25] Unknown:
And one of the other aspects of ChatOps that you brought up is the idea of accessibility across the team, particularly given the fact that, as you said, a number of the other bot frameworks require learning a particular syntax for being able to interact with the service, but you mentioned that you wanted to allow for a more natural language style of interaction. And I also noticed in the documentation that you brought that accessibility idea to the problem of installation as well, which I know is one of the problems that I've run into with some of the other frameworks. So I'm wondering if you can talk a bit about why that's such an important aspect of the project and what you did to enable that ease of installation and maintenance?
[00:17:09] Unknown:
Yeah. Sure. So I really wanted it to be simple, in that I wanted you to be able to just install opsdroid in one go without having to worry about too many dependencies and things. And it's been a little bit tricky because of the decision that I made to go with the asyncio library, and also the way I used it, because I basically set a requirement for Python 3.5 or Python 3.6. You have to have one of those on your system to use opsdroid, so that can make things a little tricky. If you look at the documentation, it just recommends that you use pip3 install opsdroid, and that will put opsdroid on your system and also give you a shell command, literally just opsdroid, that you can run. And the way that you configure opsdroid, which seems to differ from other libraries, is that there's just a single configuration file. It's a YAML file, and in there you list all the different modules that you want and all the different configuration options that you want.
opsdroid will try to give you a sensible default when you first install it. So to get started, it's literally two commands. You just install opsdroid and run opsdroid, and it will give you a basic bot. But then you can go into that YAML file and just edit it and say, I want to use Slack, here's my Slack API key, I want to install these skills, and it's quite straightforward in that sense. And the same goes for installing the modules. So as I said, there are three types of modules. They're all treated as Python modules; they all get imported. When you write them, you can either write them as an individual file or as a directory with an __init__.py in there. And so it should feel quite natural to people that are used to writing Python. But when you configure things, you have a list in the YAML file of the different modules that you want. And one of the key-value pairs that you have to have in each one is just a name.
And if you don't give it any more information than that name, it assumes that there is a module that already exists with that name in the opsdroid organisation on GitHub. And so it immediately goes away, pulls down that repository, and stores it in a virtualenv for opsdroid to then import into the project. So to get going, if you just want to use off-the-shelf libraries, you literally just specify a list of names, and that will install those modules and then you can get going. But then if you want to create your own skills, you give it a name, but you also give it either a path to your module or a git URL, either on GitHub or locally, and it will then move everything around and get it into the right place for you. But I kind of wanted to keep it really, really simple for onboarding people. I just wanted people to run two commands and then they're away.
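Based on that description, the single configuration file might look something like this. This is an illustrative sketch rather than a copy from the opsdroid docs; the module names and option keys shown here are assumptions:

```yaml
# configuration.yaml -- illustrative only
connectors:
  - name: slack                  # fetched by name from the opsdroid org on GitHub
    api-token: "xoxb-your-token-here"

databases:
  - name: mongo                  # optional: persist conversation state

skills:
  - name: hello                  # off-the-shelf skill, pulled down by name
  - name: deploy                 # custom skill from your own repository
    repo: "https://github.com/youruser/skill-deploy.git"
```

The key point from the interview is that a bare `name` resolves against the opsdroid organisation's repositories, while a `repo` or path entry points at your own module.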
[00:19:43] Unknown:
Yeah, I definitely appreciate that, because as I said, with some of the other frameworks that I've looked at implementing, the install process is a lot more involved. It requires setting up a full server with associated configuration of the instance and the service, and writing code to be able to actually get everything connected. So being able to just pip install, drop a config file, and have something working is definitely a much cleaner interface and a much lower requirement for getting up and started. So I think it's definitely much more likely to drive adoption of the framework in general. So what's been the most interesting and most challenging aspect of implementing opsdroid in the process of trying to get it up and running and get it distributable?
[00:20:26] Unknown:
So I think some of the challenges I've found are around making it as flexible as possible. You know, the more flexible you make something, the more of a management and development overhead you end up with. And I've always wanted this framework to literally just be glue between any services that are applicable to this problem. I just want somebody, either myself or a third party contributor, to be able to come in and add that capability to opsdroid. So some of the interesting things that I've been trying to tackle are how to connect other natural language understanding services up to opsdroid. So, you know, out of the box you get some standard stuff like matching against regular expressions, or triggering events off a cron job, or you can also have webhooks as well. So opsdroid will expose a RESTful API, and if you hit an endpoint, it will call a skill in the same way that you saying something to it in a messaging window will. But initially I've also hooked up api.ai, which is an API for doing kind of intent matching and natural language understanding. It gives you a lot more than regular expressions.
You go into api.ai, you create an account, and then you create an intent. So your intent will be, you know, something that you want to happen: restart a server, you know, restart server name here. You give it some examples of different permutations of that sentence. So, like, restart the server. Could you please restart this box? I want to restart the server. You know, a few different things. And then they use some kind of statistical matching to fill in all the blanks. So if a user says something which is pretty close to that but not exactly what you've said, it will give you an accuracy score. It will say, I'm 90% sure that this sentence matches this intent. And then you can configure opsdroid and say, if it's more than 70% sure, then run it. And so hooking up to that service has been really interesting. And I've definitely got some plans in the roadmap to hook up to other services like LUIS and Lex and some of the other natural language understanding APIs that exist. But they all have slightly different ideas around how to handle context.
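The confidence-threshold idea is straightforward to sketch in plain Python. This is illustrative only; real NLU services return richer response objects, and the intent names and scores below are invented:

```python
def pick_intent(candidates, min_score=0.7):
    """Given (intent, confidence) pairs from an NLU service, return the
    best-scoring intent if it clears the configured threshold, else None."""
    if not candidates:
        return None
    intent, score = max(candidates, key=lambda pair: pair[1])
    return intent if score >= min_score else None

# 90% sure it's a restart request, well above the 70% bar -> run the skill
print(pick_intent([("restart.server", 0.9), ("smalltalk", 0.2)]))
# → restart.server

# Best guess is only 55% sure -> do nothing rather than guess
print(pick_intent([("restart.server", 0.55)]))
# → None
```

The threshold is the knob the interview describes: tuning `min_score` trades false triggers against the bot ignoring slightly unusual phrasings.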
So, you know, how to manage it when a bot asks a user a yes or no question, and the user just says yes or no. The bot receives that yes or no out of context, and it has to try and figure out what's happened previously to understand what is going on. And that's quite a big area of research, it seems, within different bot frameworks. So actually allowing opsdroid to handle context, but also fit in with these different services that are all trying to apply context differently, is quite a tricky challenge. And I definitely haven't got to the bottom of it yet, but it's something that I'm really interested in working on in the next few months.
[00:23:18] Unknown:
I also imagine that the fact that the entire core of the framework is asynchronous, and you may be receiving interaction from multiple different sources simultaneously, drastically confounds the problem of trying to maintain that context, because it's not as simple as saying, okay, what was the last command that I received? That's my context. You have to be able to associate the particular channel that it's coming from, and potentially the last series of messages, and, you know, maybe match it up to the particular user whose last command it was, to be able to match up that context.
[00:23:49] Unknown:
Yeah. Exactly. You've kind of got to maintain sessions, and you've got to make decisions about, like, well, if I have maybe a 10 or 20 message exchange with the bot, and then I go away for 40 minutes and come back and continue to talk to the bot, is that a new session, or is that moving on to a new conversation? And the bot's got to make some decisions about how it honors different bits of context and how it hangs on to or releases different bits of information.
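One simple way to make the decision described here is a per-user inactivity timeout. This is a sketch under assumed rules (the 40-minute cutoff matches the example in the conversation), not how opsdroid actually implements sessions:

```python
import time

SESSION_TIMEOUT = 40 * 60  # seconds of silence after which context is dropped

class SessionStore:
    """Track the last message time per (channel, user) and expire stale context."""
    def __init__(self, timeout=SESSION_TIMEOUT):
        self.timeout = timeout
        self.sessions = {}  # (channel, user) -> (last_seen, context dict)

    def get_context(self, channel, user, now=None):
        """Return the live context for this user, starting fresh if the
        previous exchange was too long ago."""
        now = time.time() if now is None else now
        key = (channel, user)
        last_seen, context = self.sessions.get(key, (None, {}))
        if last_seen is None or now - last_seen > self.timeout:
            context = {}  # too much silence: treat it as a new conversation
        self.sessions[key] = (now, context)
        return context

store = SessionStore()
ctx = store.get_context("#ops", "jacob", now=0)
ctx["pending_question"] = "reboot web01?"
# 5 minutes later: same session, the pending question is still live
print(store.get_context("#ops", "jacob", now=300))
# 55 minutes later: session expired, context starts fresh
print(store.get_context("#ops", "jacob", now=3300))
```

Keying by `(channel, user)` also addresses the earlier point about multiple simultaneous sources: two conversations in different channels never share context.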
[00:24:16] Unknown:
And in the context of session management, one of the issues, particularly when you're talking about ChatOps in general, is the idea of security. So I'm wondering if you have any mechanisms in place, or at least some plans, for implementing authorization and authentication capabilities, so that you can ensure that you don't have some bad actor, or somebody who's not authorized to perform a certain command, either purposefully or accidentally running modifications to your production system via the chatbot?
[00:24:50] Unknown:
Yeah, it's definitely an interesting challenge. In opsdroid, I'm trying to take the approach of handing that responsibility off, if I can, onto another service, whether that's the chat service itself or some other way of dealing with it. But because opsdroid is so flexible, it perfectly allows you to install opsdroid, install a skill that lets you stop and start production instances on Amazon, and then hook that straight up to Twitter and let people on the Internet just do whatever they want to your infrastructure. You have the power to configure it badly and end up in a situation where you're leaking permissions, which goes back to something I was talking about before to do with connecting to multiple services.
One of the things that initially interested me with opsdroid was being able to connect to multiple services. So can we have a bot sitting in our Slack for internal use, and then a bot hooked up to Twitter to manage our Twitter account, auto respond, and do some interesting stuff on Twitter? I spent a lot of time thinking about how I would handle the permissions between those: we want people in the Slack channel to be able to run certain skills, but we don't want people coming through the Twitter channel to be able to run them if they're to do with infrastructure. I feel like I haven't decided well enough how to implement that and how to keep it really secure, so I just haven't crossed that line yet. It's still just a free for all at the moment. Adding some kind of role based access control, or something along those lines, is definitely on the roadmap for the future.
But I want to federate the authentication out to other services. So, say you have a set of skills for rebooting production instances and some skills that can reboot development instances, and maybe those skills only apply in specific channels on Slack. Then the onus is on whoever is managing the Slack account to make sure that only authenticated people can be in different channels, and that only the right people have access to the right services through that. Because all the skills in opsdroid are really flexible and you can write them however you like, you can install the same skill over and over with different config options. So you might have a skill for rebooting instances, and the config for one installation might have API keys that can only access development instances, while another installation might have API keys that can only access production instances. Then you can lock those skills down to the specific channels that you want, so you have different levels of permission lined up with different groups within your chat client. But I still need to think about how that would really work on the back end, and how you could be certain that your permissions wouldn't leak across between the two, or that there weren't any bugs that would allow people to escalate. There's no such thing as a perfect, bug free, secure piece of software, but that's an area where I need to work on the maturity of the project to really get it up to speed. In terms of the way we use it at the moment, we just make the assumption that the bot can do a certain number of things. When we give the bot API keys, we make sure that those keys can only do the things that we want the bot to be able to do.
So if it's rebooting an instance, it can only reboot an instance; it can't also launch instances, just in case somebody manages to get those keys out in some way. But then we just make the assumption that anybody who can access the Slack channel the bot is in can do these things. And if we want to hook the bot up to Twitter, at the moment I just use a separate deployment of opsdroid for that, so it doesn't have those skills or those keys available.
But that kind of goes against the principle of what I was trying to achieve initially. So it's definitely something I wanna work on more.
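The channel scoped permission model Jacob describes, where the same skill is installed twice with environment scoped credentials and locked to specific channels, might look something like this sketch. The config keys, skill names, and helper function are all hypothetical illustrations, not opsdroid's actual API:

```python
# Two hypothetical installations of the same "reboot" skill, each with
# credentials scoped to a single environment and allowed in one channel.
SKILLS = {
    "reboot-dev":  {"channels": ["#ops-dev"],  "api_key": "DEV-ONLY-KEY"},
    "reboot-prod": {"channels": ["#ops-prod"], "api_key": "PROD-ONLY-KEY"},
}

def authorised(skill_name, channel):
    """A skill may only run in the channels its config allows."""
    return channel in SKILLS[skill_name]["channels"]
```

The point of the design is that even if the gating fails, each installation's key can only touch its own environment, so a leak across channels is contained.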
[00:28:31] Unknown:
Yeah. As you describe the problem domain and the capabilities based permissions, it puts me in mind of an interview that I did a few weeks ago with the author of a framework called Yosai. So that might be worth taking a look at as you continue to think about the security capabilities of the system.
[00:28:48] Unknown:
Yeah, definitely. I'm actually halfway through listening to that one, so I shall continue.
[00:28:54] Unknown:
So in addition to permissions management and access control to the different systems, I'm wondering what are some of the other implications that users should be aware of when they're deploying a bot for interfacing with their production environments and the other third party services that they might have subscriptions with or want to be able to interact with from a chat framework?
[00:29:14] Unknown:
I guess a lot of it is just around making sure that, whatever permissions you allow the bot to have, you're comfortable with them, and that maybe there's some kind of throttling or rate limiting within the skill. A lot of the responsibility for ensuring that skills are secure is down to the users: if you want to write your own skills for opsdroid, the framework will give you a certain amount of help, but it's down to you to write skills that are secure. It's one of those things where it would probably be a good idea to put out some guidance and assistance, but it's also slightly out of my control. So it's about trying to ensure people are using best practices.
[00:30:05] Unknown:
And as I was preparing for this interview, one of the questions that came to mind is: what do you see as the dividing line between a chat oriented, primarily text interface bot framework and those that are created for voice interactions, such as Mycroft or Kalliope or the Amazon or Google virtual assistant tools? I'm wondering if you see any potential synchronization or integration points between those two different classes of automation.
[00:30:38] Unknown:
Yeah, it's definitely a really interesting thing to think about, because there's just been an explosion of different chat frameworks, both for text based chat and voice chat, over the last few years. Amazon have kind of led the charge with opening things up to developers and allowing them to write skills for Alexa, and a lot of people are now following suit and writing third party libraries. So if you have a look at the opsdroid project, there is a side project that I worked on, literally just for a weekend because I was curious to see if I could do it, which adds voice capabilities to opsdroid. The idea is that you install opsdroid along with this project, opsdroid audio, put them on a Raspberry Pi with a microphone, and it uses the Snowboy hotword detector. That's a nice open source Python extension which uses the microphone to listen for a specific word, and when it hears that word, it calls another function. I then wired that up to call out to opsdroid. As well as being able to hook up to Slack and services like that, opsdroid can just expose a WebSocket API, and then you can write your own chat client to connect to it, so with opsdroid audio I was exploring how to do that. And it felt very similar, interacting with the bot by speaking to it, to how I interact through the chat client. But I guess one thing you think about a lot more with text based bots is multi user interaction, especially within a single conversation. The majority of the time when I'm speaking to the bot, I'm in a group chat where multiple people from my team are all in that same room. We might be having a conversation with it where I'll say a thing and the bot will respond, and then somebody else will say a thing and the bot will respond to that.
And there's a lot of thinking about how to handle the context between those exchanges, and whether the context of one user applies to another user if you're having a multi way conversation. I don't think some of the audio frameworks like Alexa are thinking about that. I know that Microsoft have an API for picking out different voices within an audio stream, and they're providing that as a service, so people are obviously thinking about these things. But I think that's a big difference. I found it also from using services like API.AI: they assume that the bot has to respond to everything, always, because they assume you're having a one to one conversation. If I ask Alexa to do something, it would be bad if Alexa just ignored me and didn't respond; if it didn't think the request was applicable, it would probably come back and say something like, I don't know what you mean, or, what do you want me to do, or, here are some suggestions of things that I can do. Whereas when you've got a bot in a multi user room, you don't want the bot responding to everything everybody says. If I'm having a conversation with a colleague, I want the bot to stay out of it unless something is said that is relevant to the bot, and then you want it to chip in. That's definitely a tricky problem, and it's definitely not polished; our bot interrupts us with all sorts of random stuff occasionally, which can get frustrating. But it's a totally different paradigm from a one to one conversation with a device like an Echo.
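One common way to keep a bot quiet in group rooms, along the lines described above, is to gate replies on a direct mention or a sufficiently confident intent match. This is a hedged sketch of that idea; the threshold value and function names are assumptions, not part of any specific framework:

```python
CONFIDENCE_THRESHOLD = 0.8  # assumed cutoff for "worth interrupting"

def should_reply(match_score, direct_mention, one_to_one=False):
    """Decide whether the bot speaks.

    In a one to one chat the bot always answers, even if only to say
    it doesn't understand. In a group room it stays out of the
    conversation unless it was addressed directly or the natural
    language match is confident enough to be worth the interruption.
    """
    if one_to_one or direct_mention:
        return True
    return match_score >= CONFIDENCE_THRESHOLD
```

The trade-off Jacob mentions shows up directly in the threshold: set it too low and the bot interrupts with random stuff; too high and it ignores legitimate requests.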
[00:33:36] Unknown:
And also, particularly given the fact that you mentioned Home Assistant earlier in the conversation, I'm curious, one, why you didn't decide to just implement opsdroid on top of the existing asyncio loop that Home Assistant uses, and alternately, whether you have considered using some sort of bridge to integrate opsdroid with Home Assistant, so that you can have a text interface to the broader home automation capabilities that Home Assistant brings along.
[00:34:08] Unknown:
Yeah, definitely. I did consider initially writing it as an extension to Home Assistant. I think Home Assistant has to be one of my favorite open source projects; it's got a really nice community and it's a really well written application. But one of the deciding factors in making opsdroid a separate thing was the target audience I have for it. I wanted sysadmins and DevOps engineers to be installing opsdroid and using it within their workflow, and I felt like saying to them, oh, by the way, you have to install this home automation framework first, would put them off. You look at Home Assistant and you immediately think, oh, that's really cool, I can turn my lights off and do all sorts of automatic things, but you wouldn't necessarily make the leap to it being useful in system administration, although I've definitely seen examples of people using it that way. So I thought I'd make opsdroid a separate thing, but I would really like to tie in some integration between the two at some point. I've been thinking more recently about writing a skill that looks up all the available devices within Home Assistant and then adds a load of intents around switch off the lights, switch on the light, all those kinds of things. Also, Home Assistant have been adding lots of voice control features to their framework recently; the last couple of releases have included a new intents component, and their work is very much aimed at hooking in with the Google Home or Alexa ecosystems.
It's really interesting to see how they're developing that, so I've decided for the time being I'm just going to hold off until what they're working on matures a little bit, and then sit down and figure out how to tie opsdroid properly into the Home Assistant ecosystem. Because, yeah, I would really like to be able to just grab my phone, open up Facebook Messenger, and talk to my Home Assistant machine to turn off different lights, especially when I'm away from my property. I'm still a little bit cautious about actually exposing my Home Assistant to the internet; I don't really want people to be able to log in to it and see the status of things. Whereas if I can use a chat client where it's Facebook or somebody like that doing the authenticating, then I feel a bit more comfortable about interacting with my home setup that way.
[00:36:13] Unknown:
And what are some of the other features or capabilities that you have planned for the future of opsdroid?
[00:36:20] Unknown:
So, to cover what I was talking about before, one is adding extra natural language understanding services. As I said, I only support API.AI at the moment, but I definitely want to add support for Lex and LUIS and Wit.ai and some of the other services like that. I'm also really keen to add more connectors. At the moment I primarily use it through Slack, although there is a Facebook Messenger, a Twitter, and a Telegram connector for it. But obviously there are loads of other chat services that could be hooked up, so all those modules are definitely going to come in at some point.
In terms of storing context and state, you can put stuff into MongoDB or Redis at the moment, but there's no reason why it couldn't go into a MySQL database or whatever you like, so there's definitely scope for adding database modules. And then just fleshing out the skills and giving it way more out of the box default skills that people can install would be really nice. All of those are things I've got planned, but I would really like to build a community around the project and get people contributing the things that they like. I particularly like certain chat clients and certain databases, so I want to focus my development effort on those. But if other people have other preferences, I want them to be able to just come along and write their own Python class.
And that will just allow opsdroid to interface with those other services. The other main thing that I'm really interested in thinking about more is this whole idea of context and how you handle dialogues between bots and users. A lot of the chat services you see around either don't handle context very well or treat things as very one shot questions, and it's nice to see things like Alexa getting better at that. I had Alexa ask me a question the other day, in a dialogue, which I thought was quite interesting. But quite often you're just giving a single command, you know, Alexa, play the Beatles on Spotify, or whatever it is. In terms of actually maintaining a conversation, some of the other frameworks I've seen seem to be implementing things as a flowchart, almost as a tree: the user will say this, then the bot will say this, then the user will say one of these four things. It's almost like a big nested if statement, and that does seem to work quite well. But there are all sorts of issues: if the bot gets something wrong, you're stuck in that dialogue chain, or if the user just wants to change the conversation and ask it to do something else, you can end up stuck in little loops and feedback cycles. So I'm spending a lot of time thinking about how to weight different bits of context and treat the conversation as more of a graph, or just a space where you're jumping around all over the place.
And then adding those capabilities into opsdroid, maybe allowing you to mark up a skill so that instead of saying match this sentence, you can say match this sentence as long as the context is this. That context might be that the bot has just asked you a question, or that I've just been asking the bot about a specific thing, so it can assume I'm still talking about that thing. It's all stuff that's high up on the roadmap, but I just haven't got to it yet.
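The idea of "match this sentence as long as the context is this" could be expressed as a decorator that checks stored conversation state before letting a skill fire. This is a hypothetical illustration of the concept, not opsdroid's real matcher API:

```python
import re

def match_with_context(pattern, topic):
    """Fire the skill only when the sentence matches the pattern AND
    the stored conversation context agrees with the expected topic."""
    def decorator(func):
        def wrapper(text, context):
            if context.get("topic") == topic and re.search(pattern, text, re.I):
                return func(text, context)
            return None  # not applicable here: let other skills try
        return wrapper
    return decorator

# Example: "yes" only confirms a reboot if the bot just asked about one,
# solving the out-of-context yes/no problem mentioned earlier.
@match_with_context(r"\byes\b", topic="confirm_reboot")
def confirm_reboot(text, context):
    return "Rebooting now."
```

Returning None for a non-matching context lets the dispatcher fall through to other skills, which is what keeps a bare "yes" from being swallowed by the wrong conversation.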
[00:39:21] Unknown:
Yeah. And what you're saying about the decision structures for creating a more interactive dialogue with the bot brings up the idea of discoverability of the potential responses or requests that you can make. As you mentioned, when you're in a terminal environment, you can use tab completion to see the list of options for a particular command. But if you're in a chat interface, then you have to rely on explicit help commands and having the bot post a big list of capabilities. So I'm wondering if you have any thoughts on how to make that discoverability a little bit easier.
[00:39:58] Unknown:
Yeah, it's definitely a tricky problem. There are a few different options that immediately spring to mind. Especially in chat, where it's easier, you could start a conversation with the bot giving you some examples of things you can ask it when you first speak to it. Or maybe you put in some kind of help command where you can say, what can you do, and it gives you a few examples. The other thing you could possibly do is, when you ask the bot to do something, it does it and responds, but then also gives you a suggestion: why don't you try asking me this, why don't you try asking me that? But that can get a bit frustrating and annoying if you're just trying to get things done; you don't want the bot constantly spraying help manual type stuff at you. So it's definitely not a problem that I've managed to solve in my head yet; it's something I'm still thinking about. But maybe it would be useful to have some kind of web app that goes hand in hand with opsdroid, that allows you to look through what is available or what's enabled, and gives you some suggestions that way.
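A minimal version of the help command discussed above could just list one example utterance per installed skill, rather than dumping a full manual. The skill names and example phrases here are made up for illustration:

```python
# Hypothetical registry mapping each installed skill to one example phrase.
SKILL_EXAMPLES = {
    "reboot": "reboot the staging web server",
    "status": "what is the status of my-app?",
    "deploy": "deploy my-app version 1.2 to production",
}

def help_text(examples):
    """Build a short 'what can you do?' reply from the registry."""
    lines = ["You can ask me things like:"]
    lines.extend("  - " + phrase for phrase in examples.values())
    return "\n".join(lines)
```

Keeping the reply to one example per skill is a middle ground between the two failure modes mentioned: total undiscoverability on one side, and the bot spraying manual pages at you on the other.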
[00:40:57] Unknown:
So are there any other topics that you think we should discuss before we start to close out the show?
[00:41:02] Unknown:
So I guess the only other thing that I was quite interested in talking about was building a community around an open source project. We've talked a few times in this conversation about Home Assistant, and I think Home Assistant is a really inspiring project to look at. They've got a fantastic community: loads of active contributors, active forums, people writing additional libraries and things. That's the thing I've got in the back of my mind, getting opsdroid to that kind of place where there are people writing third party modules for it and people contributing to the core project. I'm hopeful that other people will find the project as useful as I do and start getting involved, and I'm in the process at the moment of trying to think of different ways of promoting the project and getting people on board. So I just wanted to say, if anybody is listening, please come and contribute to opsdroid and get involved and get stuck in. I'm definitely interested in incentivizing that, so I might look into something like giving away stickers to people who actively contribute, or something along those lines, just to try and create a friendly community around the project.
[00:42:04] Unknown:
And just the perennial reminder to people listening that contribution doesn't necessarily equal code. There are lots of ways that you can contribute to any open source project, whether it be through writing software, triaging bugs, helping with documentation, or just providing feedback about how you're using the project, so that the project authors and maintainers can have real world examples of the specific needs that people have for the project.
[00:42:26] Unknown:
Yeah, definitely. I personally find, when I'm working with different projects and I want to install or use something for the first time, I'll Google for a tutorial or a getting started guide. You end up finding loads of them, and quite often I'll click through the first few, and maybe the language style isn't quite how I like to read things, or maybe they're not diving into the right details. I'll just keep going through different blog posts until I find one that suits what I'm doing, and then have a look through that. So I definitely encourage people to just write about your use of opsdroid: how you installed it and what you're doing with it. I'd be really excited to see things like that.
[00:43:03] Unknown:
Alright. And so for anybody who wants to get in touch with you, follow the work that you're doing, or see the state of the opsdroid project, I'll have you add your preferred contact information to the show notes. Okay, excellent. And so that moves us into the picks. My pick today is a podcast I just started listening to. It looks like it's fairly early in its history, and it's called Rough Translation, from NPR. They take ideas from current events and talk about how those same kinds of ideas are being discussed or dealt with in other contexts and cultures, just to try and bring us out of our echo chamber and see what the state of the world is in areas outside of our immediate purview.
So I thought that was an interesting goal for a podcast, and the first episode so far was pretty interesting. And with that, I'll pass it to you. Do you have any picks for us, Jacob? Yeah, so funnily enough, I was going to pick a podcast as well.
[00:44:07] Unknown:
We talked a lot in this episode about Home Assistant, and about a month or so ago they started a Home Assistant podcast. They're about four or five episodes in now, and it's really interesting just hearing people talk about how they've set up Home Assistant, how they've installed different home automation hardware, and how they've linked it all together with Home Assistant. It's really great for giving you ideas about what you can do in your own home. They also interview the core maintainers of the project, so you hear about what's coming up on the roadmap and things like that. So if you're into home automation and into the Home Assistant project, the podcast is definitely worth checking out.
[00:44:44] Unknown:
Alright. So with that, I would like to thank you for taking the time out of your day to join me and tell everybody more about the work you're doing with opsdroid and ChatOps. It's definitely an interesting subject area, and one where I've been keeping an eye out for potential frameworks that I can use to implement ChatOps in my own work, and opsdroid is one of the more promising candidates. So I appreciate your work on that, and I appreciate your time, and I hope you enjoy the rest of your day. Yeah, great, thank you very much for inviting me on. Enjoy your day too.
Introduction and Guest Introduction
Transition from Web Development to Systems Administration
Introduction to Python and Early Experiences
What is Opsdroid?
Challenges with Existing Bot Frameworks
Event-Driven Nature of Opsdroid
Defining ChatOps and Its Value
Integrating Opsdroid with Configuration Management Tools
Accessibility and Ease of Installation
Challenges and Interesting Aspects of Developing Opsdroid
Natural Language Understanding and Context Management
Security and Permissions in ChatOps
Implications of Deploying Bots in Production Environments
Text vs. Voice Interaction in Bots
Integrating Opsdroid with Home Assistant
Future Plans for Opsdroid
Building a Community Around Open Source Projects
Closing Remarks and Picks