Modern static site generation

  • React offers server-side rendering, which lets you produce static HTML content directly from a tree of React.js components.
  • By the time you read this, this site will already be served by GitHub Pages, and the content you see when you view the source will have been generated by Gatsby.js.
  • Here’s my list:

    At the time I started following them, all of them were at a very early stage, and none of them could generate my site's content the way I wanted, matching my previous site so that I wouldn't lose paths and certain functionality.

  • Think of caching the home page of a WordPress-powered site and serving the cached content to every visitor.
  • Now imagine that instead of having memcached cache your HTML in front of your WordPress site, you trigger a hook each time your database changes that regenerates the frontend with Gatsby.
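The server-side rendering idea above can be sketched without React itself: a tiny renderer that walks a component tree and emits static HTML, a simplified stand-in for what `ReactDOMServer.renderToStaticMarkup` does (the `h` helper and `Post` component here are illustrative, not from the post):

```javascript
// Minimal stand-in for server-side rendering: walk a React-like
// element tree and emit static HTML, the way a static site generator
// such as Gatsby does at build time.
function h(type, props, ...children) {
  return { type, props: props || {}, children };
}

function renderToStaticMarkup(node) {
  if (typeof node === 'string') return node; // text node
  if (typeof node.type === 'function') {
    // Function component: call it with its props to get its subtree.
    return renderToStaticMarkup(node.type({ ...node.props, children: node.children }));
  }
  const attrs = Object.entries(node.props)
    .map(([k, v]) => ` ${k}="${v}"`)
    .join('');
  const inner = node.children.map(renderToStaticMarkup).join('');
  return `<${node.type}${attrs}>${inner}</${node.type}>`;
}

const Post = ({ title }) => h('article', null, h('h1', null, title));
const html = renderToStaticMarkup(h(Post, { title: 'Hello' }));
// html === '<article><h1>Hello</h1></article>'
```

Generating the page at build time like this, rather than on every request, is what makes serving it from GitHub Pages possible.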

In this post, I will talk about static site generators: how they have evolved and why I switched from a Ghost-powered site to Gatsby.js, a modern static site generator.
Continue reading “Modern static site generation”

Getting Started with Expo React Native and Styled Components using a Netflix Clone example

  • For me it opens the project in Sublime Text, but since I prefer to use WebStorm, I just manually find where the project was created and open it with WebStorm. Your initial project structure should look something like this: Let's create the basic outline of this project. I like to structure my project by putting everything in a 'src' folder and then splitting it up from there. Create a 'src' folder, then a 'components' directory with 'common', 'navigation', and 'screens' subdirectories.
  • I also use this file to store any other common styling attributes such as font sizes and padding/margin lengths. You will also need to grab some images and put them in your /assets directory, or you can use the ones that I picked out: grab the Netflix logo icon and put it under the assets/icons directory. Your project directory structure should look something like this: I like to make my imports look clean, so I would recommend that you install the babel plugin `babel-plugin-module-alias`.
  • In your .babelrc file (located in your root project directory), you will need to add the module-resolver field, and since we're keeping all our files in the /src directory, you need to set the root to src.
  • Your .babelrc file should look like this: Now let's install react-navigation and styled-components: go into the terminal, navigate to the root project directory, and type `npm install react-navigation --save` and `npm install styled-components --save`. Constants and Dummy Data: in our constants/styles.js, like so:
  • In your /src/components/navigation/home-stack-navigator.js, we can now use our HomeScreen and ShowDetailsScreen in the StackNavigator config, like so: Now let's go back to our App.js file and create a DrawerNavigator with the HomeStackNavigator as its initial route in the drawerRouterConfig. Now hopefully, if everything works, you should be able to load up the iOS simulator and see something like this: Expo has been a really great tool to bootstrap my React Native projects, and I highly recommend it to anyone who wants to get started with React Native quickly.
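A .babelrc along the lines described above might look like this (a sketch: the preset name and the plugin's exact shape assume the `babel-plugin-module-resolver` package, the successor to `babel-plugin-module-alias`, in an Expo project):

```json
{
  "presets": ["babel-preset-expo"],
  "plugins": [
    ["module-resolver", {
      "root": ["./src"]
    }]
  ]
}
```

With the root set to `src`, imports like `components/common/Button` resolve without long relative paths.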

In your /src/components/navigation/home-stack-navigator.js, we can now use our HomeScreen and ShowDetailsScreen in the StackNavigator config, like so: Now let's go back to our App.js file and create…
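The navigator wiring described here might look like the following (a configuration sketch assuming the react-navigation v1 `StackNavigator` API; the screen import paths are hypothetical):

```javascript
// src/components/navigation/home-stack-navigator.js (sketch)
import { StackNavigator } from 'react-navigation';
import HomeScreen from '../screens/HomeScreen'; // hypothetical path
import ShowDetailsScreen from '../screens/ShowDetailsScreen'; // hypothetical path

// Each key becomes a route; the drawer navigator in App.js can then
// use this stack as its initial route.
const HomeStackNavigator = StackNavigator(
  {
    Home: { screen: HomeScreen },
    ShowDetails: { screen: ShowDetailsScreen },
  },
  { initialRouteName: 'Home' }
);

export default HomeStackNavigator;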
Continue reading “Getting Started with Expo React Native and Styled Components using a Netflix Clone example”

Announcing “Advanced React.js Online” – componentDidBlog

  • I can barely believe it, but over the last 27 months, Michael Jackson and I have taught React.js to over 3,500 developers at 91 workshops in eight different countries. (Oh, and one prison cell in the UK, but I digress…) In that time we've learned a lot about React as library authors, product developers, and teachers. We're the authors of React Router.
  • That kind of usage has exposed us to the needs of all sorts of applications, pushing us to find simpler ways to be composable as library authors. Meeting 3,500 people (and more at the meetups we regularly attend in our travels) has put a lot of code in front of us, and even more conversations about code.
  • It's been fun learning the unique needs of all these people and figuring out ways to make their code better as product developers. Teaching React to thousands has helped us identify parts that some folks initially struggle with.
  • This helps us refine our material one workshop at a time, making us much better teachers. We're happy to announce that we're bringing all of that experience to you in our very first online course: Advanced React.js.
  • We’ve taken the most popular parts of our workshops and put them into the course.

Ninety-one. I can barely believe it, but over the last 27 months, Michael Jackson and I have taught React.js to over 3,500 developers at 91 workshops in eight different countries (Oh, and one prison…
Continue reading “Announcing “Advanced React.js Online” – componentDidBlog”

How to make Jenkins build NodeJS, Ruby, and Maven on Docker


  • Jenkins can speed up repetitive tasks that robots are much better at performing, and Docker lets you spin up isolated environments for your application in a simple and repeatable way.
  • What I'm going to cover here is how to set up your Jenkins server in a Docker container and how to work around some of the limits of the official Jenkins image.
  • While it is nice of Jenkins to offer an official Docker image, you'll quickly run into a few issues if you are doing anything other than compiling plain Java.
  • For instance, Maven, the popular dependency management tool for Java, is not included in the Docker container.
  • You've learned to create a Jenkins Docker container that's ready to run jobs for Java, Ruby, and NodeJS projects.

Set up your Jenkins server in a Docker container and fix some of the limits of the official image that I discovered, so you can get up and building faster.
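A Dockerfile in the spirit of the post might extend the official image along these lines (a sketch, not the post's exact file; the apt package names are assumptions and versions will vary):

```dockerfile
FROM jenkins/jenkins:lts

# The official image runs as the unprivileged jenkins user;
# switch to root to install build tooling.
USER root

# Maven for Java builds, NodeJS/npm and Ruby for the other job types.
RUN apt-get update && apt-get install -y \
      maven \
      nodejs \
      npm \
      ruby-full \
    && rm -rf /var/lib/apt/lists/*

# Drop back to the jenkins user so jobs don't run as root.
USER jenkins
```

Baking the toolchains into the image keeps builds repeatable: every agent spun up from it can run Java, Ruby, and NodeJS jobs without per-host setup.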
Continue reading “How to make Jenkins build NodeJS, Ruby, and Maven on Docker”

Moving Large Scale Web Apps to React

WEDNESDAY || Brooklyn, NY: Moving Large Scale Web Apps to @reactjs


  • Just like with any production app, you may now be in the perpetual maintenance cycle, or the app is still being used but is begging to be rewritten.
  • In this talk, Stan Bershadskiy will take you through the process of taking an existing application and porting it over to a more modern technology stack, such as React.
  • Stay after Stan’s talk and chat with current organizers to discuss open roles and responsibilities.
  • Stan Bershadskiy is an architect at Modus Create who specializes in all things JavaScript, with deep knowledge of Sencha frameworks.
  • Stan is located in New York City and can be found co-organizing NYC.JS Meetups and presenting at conferences and meetups around the country.

Through the years we’ve built countless web apps using our favorite front-end tooling at the time. Just like with any production app, you may now be in the
Continue reading “Moving Large Scale Web Apps to React”

How HBO’s Silicon Valley built “Not Hotdog” with mobile TensorFlow, Keras & React Native

  • The depth and openness of the deep learning community, and the presence of talented minds like R.C., are what make deep learning viable for applications today, and they also make working in this field more thrilling than any tech trend we've been involved with. Our final architecture ended up making significant departures from the MobileNets architecture and from convention. In particular: we do not use Batch Normalization and activation between depthwise and pointwise convolutions, because the XCeption paper (which discussed depthwise convolutions in detail) seemed to indicate it would actually lead to less accuracy in architectures of this type (as helpfully pointed out by the author of the QuickNet paper on Reddit).
  • While this is a subject of some debate these days, our experiments placing BN after activation on small networks failed to converge as well. To optimize the network we used Cyclical Learning Rates and (fellow student) Brad Kenstler's excellent Keras implementation.
  • This was hard to defend against, as (a) there just aren't that many photographs of hotdogs in soft focus (we get hungry just thinking about it), and (b) it could be damaging to spend too much of our network's capacity training for soft focus, when realistically most images taken with a mobile phone will not have that feature.
  • Of the remaining 147k images, most were of food, with just 3k photos of non-food items, to help the network generalize a bit more and not get tricked into seeing a hotdog when presented with an image of a human in a red outfit. Our data augmentation rules were as follows: rotations within ±135 degrees (significantly more than average, because we coded the application to disregard phone orientation); height and width shifts of 20%; a shear range of 30%; a zoom range of 10%; channel shifts of 20%; and random horizontal flips to help the network generalize. These numbers were derived intuitively, based on experiments and our understanding of the real-life usage of our app, as opposed to careful experimentation. The final key to our data pipeline was using Patrick Rodriguez's multiprocess image data generator for Keras.
  • Phase 2 ran for 64 more epochs (4 CLR cycles with a step size of 8 epochs), with a learning rate between 0.0004 and 0.0045, on a triangular2 policy. Phase 3 ran for 64 more epochs (4 CLR cycles with a step size of 8 epochs), with a learning rate between 0.000015 and 0.0002, on a triangular2 policy. UPDATED: a previous version of this chart contained inaccurate learning rates. While the learning rates were identified by running the linear experiment recommended by the CLR paper, they intuitively make sense, in that the max for each phase is within a factor of 2 of the previous minimum, which is aligned with the industry-standard recommendation of halving your learning rate if your accuracy plateaus during training. In the interest of time we performed some training runs on a Paperspace P5000 instance running Ubuntu.
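The triangular2 schedule used in these phases is simple to state: the learning rate oscillates linearly between a base and a max over each cycle, and the peak amplitude halves every cycle. A minimal sketch (the function name and signature are mine, not the post's Keras implementation):

```javascript
// Cyclical learning rate, "triangular2" policy: the rate climbs linearly
// from baseLr to maxLr over stepSize iterations, descends back to baseLr,
// and the peak amplitude is halved on each successive cycle.
function triangular2(iteration, stepSize, baseLr, maxLr) {
  const cycle = Math.floor(1 + iteration / (2 * stepSize));
  const x = Math.abs(iteration / stepSize - 2 * cycle + 1);
  const scale = 1 / Math.pow(2, cycle - 1); // halve amplitude each cycle
  return baseLr + (maxLr - baseLr) * Math.max(0, 1 - x) * scale;
}

// Phase 2's numbers from the post: base 0.0004, max 0.0045, step size 8 epochs.
const lrStart = triangular2(0, 8, 0.0004, 0.0045);  // start of cycle 1: 0.0004
const lrPeak = triangular2(8, 8, 0.0004, 0.0045);   // peak of cycle 1: 0.0045
const lrPeak2 = triangular2(24, 8, 0.0004, 0.0045); // peak of cycle 2: half amplitude
```

This matches the "halve your learning rate when accuracy plateaus" intuition the post mentions, but does it on a fixed schedule instead of reacting to the loss curve.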

How Silicon Valley built the real AI app that identifies hotdogs (and not hotdogs) using mobile TensorFlow, Keras & React Native.
Continue reading “How HBO’s Silicon Valley built “Not Hotdog” with mobile TensorFlow, Keras & React Native”