{"id":79,"date":"2017-11-08T00:00:37","date_gmt":"2017-11-08T00:00:37","guid":{"rendered":"http:\/\/www.matez.de\/?p=79"},"modified":"2017-11-08T14:09:32","modified_gmt":"2017-11-08T14:09:32","slug":"free-tier-hunt-howto-combine-heroku-and-openshift","status":"publish","type":"post","link":"https:\/\/www.matez.de\/index.php\/2017\/11\/08\/free-tier-hunt-howto-combine-heroku-and-openshift\/","title":{"rendered":"Free tier hunt &#8211; howto combine heroku and openshift"},"content":{"rendered":"<h2>Background<\/h2>\n<p>Recently, Red Hat <a href=\"https:\/\/blog.openshift.com\/migrate-to-v3-v2-eol\/\">shut down its OpenShift platform v2<\/a>. Now only OpenShift 3 is available, and it is based on Kubernetes and Docker.<\/p>\n<p>So why do I care?<\/p>\n<p>I had some JEE projects running on OpenShift 2, and now I was forced to migrate them to the new infrastructure. Basically my applications consist of a Tomcat or WildFly server running next to a MySQL database. This setup was pretty easy, and I could run it at no cost as long as I could tolerate that the cartridges would be shut down if they idled longer than 24 hours (meaning no HTTP request reached the server during that time). But ok&#8230;<\/p>\n<p>So now I had to migrate my projects following the migration guide provided by Red Hat. In parallel I was interested in whether I had other options, i.e. whether any other PaaS providers offer a free tier for small personal projects. And yes, there are plenty of them, but as I found, all have their limitations.<\/p>\n<p>AWS is free for one year, then you pay. Google Cloud &#8230; can&#8217;t remember what was holding me back&#8230; Oracle Cloud gives you a certain amount of credit, but the evaluation phase is 3 months max. Microsoft&#8230; really?<\/p>\n<h2>Going with Heroku, but&#8230;<\/h2>\n<p>Then I found Heroku offering a &#8220;free dyno&#8221;, which also idles after 30 minutes of inactivity, but I wanted to give it a try. 
Later I found that if you want to use a database with Heroku, the limits on the &#8220;free&#8221; database add-ons are something like 5MB or 100 rows, so even though I only have a few playground datasets, that was too small.<\/p>\n<p>Then I had the idea of combining two PaaS providers: one giving me the application tier for free, the other giving me the database for free.<\/p>\n<p>I ended up looking into how to connect to a database running on OpenShift from outside of the Docker environment. What I found is the same way an admin would connect to it: via a tunnel and\/or port forwarding.<\/p>\n<h2>Oh my god, the latency between application and database will be huge!<\/h2>\n<p>Yes, it will be noticeable, but as I do not propose this setup for an enterprise application running in production, I am fine with that.<\/p>\n<p>So here is my solution on how to connect from an application running in a Heroku dyno to a MySQL database running on OpenShift.<\/p>\n<p>Heroku provides a mechanism which allows you to pack anything you need to run your application into your dyno. So if you need Java, the buildpack &#8220;heroku\/java&#8221; is for you; if you need Node, there is a buildpack for Node, and so on. The nice thing about buildpacks is that you can also create them on your own, using the <a href=\"https:\/\/devcenter.heroku.com\/articles\/buildpack-api\">buildpack API<\/a>.<\/p>\n<p>A buildpack can contain a shell script (profile.d), which is executed during startup of the container. 
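<\/p>\n<p>As a rough sketch (this is not the literal script from the buildpack repository; the pod resolution helper and the tunnel command here are my assumptions based on the standard oc CLI), the core of such a profile.d script could look like this:<\/p>\n

```shell
#!/bin/bash
# Sketch of a profile.d-style startup script (an assumption based on the
# oc CLI; the actual script in heroku-buildpack-oc may differ).

# Resolve the first pod whose name starts with the configured prefix,
# e.g. "mysql" matches "mysql-1-weuoi".
resolve_pod() {
  # $1: newline-separated pod names, $2: name prefix
  echo "$1" | grep "^$2" | head -n 1
}

# Demo with a hard-coded pod list; in the dyno this would come from
# `oc get pods` after `oc login $OC_LOGIN_ENDPOINT --token=$OC_LOGIN_TOKEN`.
PODS="mysql-1-weuoi
app-2-abcde"
POD=$(resolve_pod "$PODS" "${OC_POD_NAME:-mysql}")  # OC_POD_NAME holds the prefix
echo "$POD"

# Keep the tunnel alive: reopen the port-forward whenever it exits
# (commented out so the sketch runs without a cluster):
# while true; do
#   oc port-forward "$POD" "$OC_LOCAL_PORT:$OC_REMOTE_PORT"
#   sleep 5
# done
```

\n<p>In the real buildpack the pod list comes from the oc CLI after logging in with the configured token, not from a hard-coded string. 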
This is the perfect place to create a tunnel and provide access to my remote database. You can find the buildpack at <a href=\"https:\/\/github.com\/mwiede\/heroku-buildpack-oc\">https:\/\/github.com\/mwiede\/heroku-buildpack-oc<\/a><\/p>\n<p>So here is how you can create a Heroku application with access to a remote database:<\/p>\n<ol>\n<li>install the Heroku CLI<\/li>\n<li>create an app<\/li>\n<li>add my buildpack\n<pre><span class=\"pl-s1\">heroku buildpacks:add https:\/\/github.com\/mwiede\/heroku-buildpack-oc<\/span><\/pre>\n<\/li>\n<li>configure the environment variables\n<pre>$ <span class=\"pl-s1\">heroku config:set OC_LOGIN_ENDPOINT=https:\/\/api.starter-ca-central-1.openshift.com <\/span>\r\n$ <span class=\"pl-s1\">heroku config:set OC_LOGIN_TOKEN=askdjalskdj <\/span>\r\n$ <span class=\"pl-s1\">heroku config:set OC_POD_NAME=mysql-1-weuoi <\/span>\r\n$ <span class=\"pl-s1\">heroku config:set OC_LOCAL_PORT=3306 <\/span>\r\n$ <span class=\"pl-s1\">heroku config:set OC_REMOTE_PORT=3306<\/span><\/pre>\n<\/li>\n<li>deploy the app<\/li>\n<li>check the logs to see whether the connection works properly.<\/li>\n<\/ol>\n<h2>Advanced usage<\/h2>\n<p>The profile.d script contains a loop, so whenever the tunnel connection shuts down, it tries to open it again.<\/p>\n<p>From the perspective of OpenShift, the database runs in a so-called pod, and unfortunately its name can change.<\/p>\n<p>I tried to make this as robust as possible, so <span class=\"pl-s1\">OC_POD_NAME<\/span> only needs to contain a prefix of the pod name; for instance, &#8220;mysql&#8221; is enough to detect the right one.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Background Recently, Red Hat shut down its OpenShift platform v2. Now only OpenShift 3 is available and it is based on Kubernetes and Docker. So why do I care? I had some JEE projects running on OpenShift 2 and now I was forced to migrate them to the new infrastructure. 
Basically my applications consist of a tomcat &hellip; <a href=\"https:\/\/www.matez.de\/index.php\/2017\/11\/08\/free-tier-hunt-howto-combine-heroku-and-openshift\/\" class=\"more-link\">Continue reading<span class=\"screen-reader-text\"> &#8220;Free tier hunt &#8211; howto combine heroku and openshift&#8221;<\/span><\/a><\/p>\n","protected":false},"author":1,"featured_media":0,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":[],"categories":[1],"tags":[16,15,12,11,13,14],"_links":{"self":[{"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/posts\/79"}],"collection":[{"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/comments?post=79"}],"version-history":[{"count":6,"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/posts\/79\/revisions"}],"predecessor-version":[{"id":94,"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/posts\/79\/revisions\/94"}],"wp:attachment":[{"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/media?parent=79"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/categories?post=79"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/www.matez.de\/index.php\/wp-json\/wp\/v2\/tags?post=79"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}