Wow. So we finally made it to project week, the third week of our cohort thus far. We were tasked with the challenge of building our first CLI gem with Ruby. After three weeks of learning everything from string interpolation to object orientation, I figured, “It can’t be THAT hard.” HA! So I thought….
First things first, I had to figure out what I wanted to create my gem for. After brainstorming many ideas, I decided to create a CLI centered around skincare products. In order to do this, I had to extract data from a website using a technique called web scraping so that I would be able to display that data in my CLI. So, how exactly did I do that?
I started off by looking into gems already available in the Ruby ecosystem that I could use to test, run, and scrape data from the website I was using. RubyGems.org was a great resource when it came to finding the specific gems I needed. Some of the gems I used were “nokogiri” and “pry”: Nokogiri to scrape the data I needed, and Pry to test and debug any issues I bumped into while running my code.
I was able to store the info for all these gems I installed in a file in my program called the “Gemfile”. When you install the gems as a bundle using the terminal command ‘bundle install’, it creates a separate file called “Gemfile.lock”, which records the exact versions and dependencies of the various gems you are using.
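For reference, the Gemfile for a project like this might look something like the sketch below; the gem list is a guess based on the gems mentioned above, and version pins are omitted:

```ruby
# Gemfile -- declares the project's gem dependencies for `bundle install`.
source "https://rubygems.org"

gem "nokogiri"  # parsing and scraping HTML
gem "pry"       # interactive debugging console
```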
The next hurdle was creating my scraper class. This class included the URLs for all of the products I wanted to feature in my gem. Nokogiri allowed me to access a whole body of information using just the webpage’s URL. However, nothing prepared me for the work I had to do next. Scraping is more time-consuming and more specific than extracting information from something like an API (Application Programming Interface). An API, in layman’s terms, is a software ‘middle-man’ that allows two applications to talk to each other. Sites like Facebook and Twitter have APIs to be able to share info freely and with the public. Since I was extracting info from a specific website, scraping was the better option.
Next up: Figuring out my classes and their methods!
My classes were split into four: Body Cream, Body Lotion, Shower Gel, and Body Fragrance. It was in these classes where I really was able to learn more about the difference between a class method and an instance method. I also learned more about the importance of “self” when it comes to defining a class method versus an instance method. Within each class, a class variable like “@@all” can store every instance that gets created, so a class method can reference all instances of the class you are calling in your program.
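A sketch of what one of those product classes might look like is below. The attribute names are hypothetical; the point is the class method vs. instance method distinction, and how “@@all” tracks every instance:

```ruby
class BodyCream
  attr_accessor :name, :price

  @@all = []  # class variable: holds every BodyCream ever created

  def initialize(name, price)
    @name  = name
    @price = price
    @@all << self  # register this new instance on creation
  end

  # Class method: defined with `self.`, called on the class itself.
  def self.all
    @@all
  end

  # Instance method: called on one particular body cream.
  def summary
    "#{name} -- #{price}"
  end
end

BodyCream.new("Shea Butter Cream", "$12.00")
puts BodyCream.all.size            # how many creams exist
puts BodyCream.all.first.summary   # details of one instance
```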
The CLI class is where I brought all the methods together to run the whole CLI gem. It included a welcome message, a prompt for options, as well as a prompt for more info on the user’s choice. At the end, the user can choose whether to explore other options or exit the gem. All in all, this was a challenging, yet fun experience. It was amazing to see how all the knowledge we attained over these three weeks finally came together. I can’t wait to see what new challenges await!
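As a postscript for the curious, the CLI flow described above might be sketched like this. The class name, method names, and messages are all hypothetical, not the actual gem’s code:

```ruby
# A minimal CLI loop: welcome, numbered options, detail on a choice, exit.
class SkincareCLI
  CATEGORIES = ["Body Cream", "Body Lotion", "Shower Gel", "Body Fragrance"]

  def call
    puts "Welcome to the Skincare CLI!"
    loop do
      list_categories
      puts "Enter a number for more info, or type 'exit':"
      input = gets.strip.downcase
      break if input == "exit"
      puts info_for(input.to_i)
    end
    puts "Thanks for browsing. Goodbye!"
  end

  # Prints each category with a 1-based number the user can pick.
  def list_categories
    CATEGORIES.each_with_index { |cat, i| puts "#{i + 1}. #{cat}" }
  end

  # Returns a detail line for a valid choice, or an error message.
  def info_for(number)
    cat = CATEGORIES[number - 1] if number.between?(1, CATEGORIES.size)
    cat ? "You chose #{cat}." : "Invalid choice, please try again."
  end
end
```

Keeping `info_for` as a method that returns a string (rather than printing directly) makes the CLI much easier to test in Pry.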