
2019

Best of Python 3 f-strings

This piece is primarily meant for those new to Python, including mathematicians, economists, and others who want to use Python within a Jupyter environment. Here is a quick guide to making the best of Python 3 f-strings.

Quick Primer

If you are familiar with earlier Python versions, here are my top picks on how to move from .format() to f-strings:

{{< highlight python >}}
_fstring = f'Total: {one + two}'  # Go f-string!
_format = 'Total: {}'.format(one + two)
_percent = 'Total: %s' % (one + two)
_concatenation = 'Total: ' + str(one + two)
assert _fstring == _format == _percent == _concatenation
{{< /highlight >}}

f-string Magic

f-strings are how you should write print statements in Python. The inline notation is fairly reminiscent of LaTeX:

{{< highlight python >}}
# inline variables, similar to LaTeX
name = "Fred"
print(f"He said his name is {name}.")
# He said his name is Fred.
{{< /highlight >}}

Notice how the variable name can now be used inline. The syntax is simple and easy to use: wrap the variable in surrounding {} and mark the string as an f-string with the 'f' prefix.

Note to the advanced programmer:

‘f’ may be combined with ‘r’ to produce a raw f-string, which is handy inside regex patterns and similar functions. ‘f’ may not be combined with ‘u’, because all Python 3.6+ strings are Unicode by default now. This means you can write f-strings in Hindi, Chinese, French, Korean, or any other language covered by Unicode.
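As a minimal sketch of combining the ‘f’ and ‘r’ prefixes in a regex (the field name here is just an illustrative example):

```python
import re

# rf-string: f-string interpolation plus raw-string backslash handling
field = "name"
pattern = rf"\b{field}\b"  # backslashes stay literal, {field} is interpolated

print(pattern)  # \bname\b
match = re.search(pattern, "my name is Fred")
print(match is not None)  # True
```

The same string written without the ‘r’ prefix would need doubled backslashes (`f"\\b{field}\\b"`), which is exactly the noise raw strings exist to avoid.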

But why are these called formatted strings in the first place? Because you can use them with some cool formatting hacks.

Simplified Alignment and Spacing

Have you ever tried creating a table, such as one for logging or visualization? Arranging the elements becomes a nightmare with several \t tab characters flying around.

This is much easier with Python f-strings using the colon ‘:’ operator, followed by an alignment operator and a field width value.

There are at least three alignment operators: < for left-aligned, > for right-aligned, and ^ for center-aligned. Refer to the code example:

{{< highlight python >}}
correct = 'correct'
phonetic_correct = 'phonetic_correct'
typo = 'typo'
phonetic_typo = 'phonetic_typo'
phonetic_distance = 'phonetic_distance'

{{< /highlight >}} {{< highlight python >}}

print(f'No Spacing:')
print(f'{correct}|{phonetic_correct}|{typo}|{phonetic_typo}|{phonetic_distance}|\n')

No Spacing:

correct|phonetic_correct|typo|phonetic_typo|phonetic_distance|

{{< /highlight >}} {{< highlight python >}}

print(f'Right Aligned:')
print(f'{correct:>10}|{phonetic_correct:>20}|{typo:>10}|{phonetic_typo:>20}|{phonetic_distance:>20}|\n')

Right Aligned:

correct| phonetic_correct| typo| phonetic_typo| phonetic_distance|

{{< /highlight >}} {{< highlight python >}}

print(f'Left Aligned:')
print(f'{correct:<10}|{phonetic_correct:<20}|{typo:<10}|{phonetic_typo:<20}|{phonetic_distance:<20}|\n')

Left Aligned:

correct |phonetic_correct |typo |phonetic_typo |phonetic_distance |

{{< /highlight >}} {{< highlight python >}}

print(f'Centre Aligned:')
print(f'{correct:^10}|{phonetic_correct:^20}|{typo:^10}|{phonetic_typo:^20}|{phonetic_distance:^20}|')

Centre Aligned:

correct | phonetic_correct | typo | phonetic_typo | phonetic_distance |

{{< /highlight >}}

You also have support for decimal truncation and similar standard formatting utilities: {{< highlight python >}}

# auto-resolve variable scope when nested
import decimal

width = 10
precision = 4
value = decimal.Decimal("12.34567")
print(f"result: {value:{width}.{precision}}")  # nested fields
# result:      12.35

{{< /highlight >}}

You might notice something interesting here: width and precision are automatically picked up from the enclosing scope. This means you can calculate width and precision from the screen width or other inputs from the system, and use those.
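For instance, here is a sketch of deriving the width from the terminal at runtime (using shutil.get_terminal_size as one possible system input; it falls back to 80 columns when no terminal is attached):

```python
import decimal
import shutil

# Compute the field width from the terminal instead of hard-coding it
width = shutil.get_terminal_size().columns  # e.g. 80 without a tty
precision = 4
value = decimal.Decimal("12.34567")

line = f"{value:>{width}.{precision}}"
print(line)  # '12.35' right-aligned to the full terminal width
```

Because the nested `{width}` and `{precision}` fields are plain expressions, any runtime value works there, not just literals.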

Full Python Expressions Support

The above is only possible because the expression inside {} is actually evaluated at runtime, or in programming terms: executed.

This implies that you can make any function call from within those {}.

Though you should avoid doing this very often in practice, because it can make your debugging very difficult. Instead, store the value returned from the function in a variable, and then use that variable in an f-string print statement.
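A quick sketch of both styles (hypotenuse here is just an illustrative stand-in function):

```python
import math

def hypotenuse(a, b):
    return math.hypot(a, b)

# Possible, but harder to debug: the call happens inside the braces
print(f"direct: {hypotenuse(3, 4)}")  # direct: 5.0

# Easier to inspect: store the result first, then interpolate the variable
side = hypotenuse(3, 4)
print(f"stored: {side}")  # stored: 5.0
```

With the second style you can set a breakpoint on `side` or log it separately, which is exactly what you lose when the call is buried inside the braces.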

Those coming from functional programming might miss their lambda functions. Don’t worry, Python has you covered:

Lambda Functions in f-strings

{{< highlight python >}}

# If you feel you must use lambdas, they may be used inside parentheses:
print(f'{(lambda x: x*3)(3)}')
# 9
# note that the f-string produced a str ('9') and not an int (9)

{{< /highlight >}}

Summary

  • f-strings mean you can include variables and function calls inside your print statements
  • Inline variables are easier to read and debug for the developer
  • Use f-strings when you can!

The Silent Rise of PyTorch Ecosystem

While TensorFlow has made peace with Keras as its high-level API and MXNet now supports Gluon, PyTorch is the bare matrix love.

PyTorch has seen rapid adoption in academia and in all the industrial labs that I have spoken to as well. One of the reasons people (especially engineers doing experiments) like PyTorch is the ease of debugging.

What I don’t like about PyTorch is its incessant need for debugging, caused by inconsistent-dimension problems. In fact, one of the most recommended speed hacks for faster development is: assert tensor shapes!
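A minimal, framework-agnostic sketch of that shape-assert habit (assert_shape is an illustrative helper, not a PyTorch API; the SimpleNamespace object stands in for a real tensor, which exposes .shape the same way):

```python
from types import SimpleNamespace

def assert_shape(tensor, expected):
    """Fail fast with a readable message when dimensions drift mid-model."""
    shape = tuple(tensor.shape)
    assert shape == expected, f"expected shape {expected}, got {shape}"

# Stand-in for a real tensor; torch.randn(32, 128) would expose .shape too
batch = SimpleNamespace(shape=(32, 128))
assert_shape(batch, (32, 128))    # passes silently
# assert_shape(batch, (32, 64))   # would raise with both shapes in the message
```

Sprinkling such asserts after each layer or reshape turns a cryptic downstream matmul error into a one-line failure at the point where the shape actually went wrong.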

This is something which Keras abstracts out really well. Additionally, PyTorch has no high-level abstraction which picks good defaults for most common problems.

This leads us to the observation that there are three niche problems unsolved in the PyTorch ecosystem:

Unsolved Problems

  • General Purpose Abstraction: Over PyTorch similar to Keras or tf.learn
  • Adoption: Something to help traditional ML practitioners adopt PyTorch more easily
  • Production Use Cases: Something which allows engineers to take PyTorch code as-is into production, or port it to Caffe2 with minimal effort. I like Gluon for this; it has no community support but is backed by both MSFT and AWS.

A few specialized efforts, like AllenAI’s NLP libraries (built for NLP) or PyTorch’s torchvision and torchtext, are domain-specific rather than a generic abstraction similar to Keras. They deserve their own discussion space, separate from here.

The Better Alternatives

fast.ai

fastai has outrageously good defaults for both vision and NLP. It has several amazing implementations: cyclic learning rates, learning-rate schedulers, data augmentation, decent API design, interesting dataloaders, and, most importantly, it is extremely extensible!

It has seen some rather great adoption among Kagglers and beginners alike for faster experimentation. It is also helped by their amazing MOOC.

Ignite

Ignite helps you write compact but full-featured training loops in a few lines of code. It is fairly extensible and results in a lot of compact code, with no peeking under the hood required. This is the best contender to Keras for PyTorch power users.

I do not know of any power users of Ignite, despite its elegant design. Nor have I seen its adoption in the wild.

PTL: PyTorch-Lightning

Built by folks over at NYU and FAIR, Lightning gives you the skeleton to flesh out your experiments. It is the best contender to Keras for researchers. The built-in mixed-precision support (via apex) and distributed training are definitely helpful.

The biggest value add, I guess, will be explicit decisions, all in one class, instead of the scattered pieces we see with PyTorch. Yay, reproducibility!

The lib is still very new, and that shows in its lack of adoption, but it racked up a lot of stars in its first week of launch!

Check out the detailed comparison between Lightning and Ignite from the creator of Lightning.

Skorch

skorch is attacking the adoption problem above: bringing traditional ML people to deep learning.

skorch is a scikit-learn-style wrapper (with metrics and pipelines support!) for PyTorch, by a commercial entity invested in its adoption. It is being developed fairly actively (the most recent master commit is less than 15 days old) and is marching to v1.

Summary

  • fast.ai: researchers and rapid iterators like Kagglers
  • skorch: welcomes people coming from more traditional machine learning backgrounds
  • PyTorch Lightning: custom-built for DL experts looking for experimentation tooling

Ending Note: What are you using for deep learning experiments? Have you seen the light with PyTorch, or are you still going with TensorFlow? Tell me @nirantk

Tech Talk Tips

A collection of the best advice on the Internet that I know about on giving a tech talk, based on responses to my question on Twitter.

{{< figure src="/images/meghanaTalk.jpg" caption="Meghana gave a talk based on these tips at PyData Bengaluru" >}}

You: Hey, I know something better!

Me: Please tell me about it! Raise a PR. Or reply to the tweet above!

The Mindset

How to structure and style the talk?

Everyone's experiences are different, so I'm not sure how well mine generalise. But: find someone whose style of presentation you like and take some inspiration. There are so many different ways of delivering a talk, and it's all about finding the one that works best for your personality.

IMO, slides shouldn't contain everything that's being said in the talk. They should provide an overview of the main points and complement them visually – e.g. with diagrams, illustrations, small code examples. Also, people LOVE taking pics of slides.

I typically practice my talks in logical units, then practice the transitions, then put it all together at the end. I find that much easier and more efficient than only ever doing full run-throughs. But then again, people have different preferences here.

  • From Ines, the co-creator of spaCy

On how to start the talk:

Give an outline to the audience of the topics that will be addressed. More importantly, gauge the audience's existing level of understanding of the subject and modify the presentation accordingly. The best way is to ask questions before starting the presentation.

On what is important

Holding the attention of the audience is very important. You are easier to pay attention to when you are:

  • entertaining to the audience
  • telling a story
  • telling the truth

For every point of truth you want to cover, wrap it with an entertaining story arc. One arc per point.

List all your points. Then list all the stories you could tell per point. Then sort the stories by entertainment value to the audience. Assume one story arc per approximately 10 minutes of speaking. A mental model of "telling a fun story to someone you've met a few times" helps. - From Sidu Ponappa (@ponappa), GoJek India MD

Don’t fear live coding or live demos

For technical talks, I’m a big fan of live coding or live demos, as opposed to presenting all the material on slides. Audiences like live coding better than slides, because they actually get to see how the system works, as opposed to looking at static snapshots of a perfect working system. Have all my live coding talks gone perfectly? No, but I’ve found audiences to be very understanding when things go awry. Everyone knows that there is risk involved, and I’ve even had people tell me they learned more when they watched the recovery from (or explanation of) an error than they would have by watching glossy slides fly by. Audience attention is definitely higher for live coding talks.

More good