Polymorphism In Programming


Polymorphism describes an object's ability to take on multiple forms. This article looks at polymorphism and how it is used in programming.

What is polymorphism?

At its base level, polymorphism comes from mathematical type theory. In computer science, a polymorphic object is one that is capable of taking on multiple forms. The kind of polymorphism the object undergoes depends on when the object takes its form and what part of the object is transforming.


When the object transforms:

  1. Compile-time
  2. Dynamic

What part of the object does the transforming:

  1. Method
  2. Object
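The two axes above can be sketched in Java. The class and method names here are illustrative assumptions, not from the article: method overloading shows compile-time polymorphism (the compiler picks the method by argument type), while method overriding shows dynamic polymorphism (the running program picks the method by the object's actual class).

```java
// Illustrative sketch: compile-time vs. dynamic polymorphism in Java.
class Shape {
    double area() { return 0.0; }
}

class Circle extends Shape {
    double radius;
    Circle(double radius) { this.radius = radius; }
    // Dynamic polymorphism: this override is chosen at run time,
    // based on the object's actual class, not its declared type.
    @Override
    double area() { return Math.PI * radius * radius; }
}

class Printer {
    // Compile-time polymorphism: the compiler selects an overload
    // by the argument's static type.
    static String describe(int n)    { return "int: " + n; }
    static String describe(double d) { return "double: " + d; }
}

public class PolymorphismDemo {
    public static void main(String[] args) {
        Shape s = new Circle(1.0);        // declared type Shape, actual type Circle
        System.out.println(s.area());     // dispatches to Circle.area()
        System.out.println(Printer.describe(42));   // resolves to describe(int)
        System.out.println(Printer.describe(4.2));  // resolves to describe(double)
    }
}
```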

Polymorphism in programming

“In programming languages and type theory, polymorphism is the provision of a single interface to entities of different types, or the use of a single symbol to represent multiple different types.”

Polymorphism is essential to object-oriented programming (OOP). Objects are defined by classes, which can have properties and methods. For example, we could create an object defined by the class Car….
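The article's Car definition is cut off, so the following is a hypothetical sketch of what such a class might look like: a Car with a property and a method, plus a subclass whose override lets the same interface take another form.

```java
// Hypothetical Car example; the fields and methods are assumptions,
// since the article's original definition is truncated.
class Car {
    String make;                       // a property
    Car(String make) { this.make = make; }
    String describe() {                // a method
        return "A car made by " + make;
    }
}

class ElectricCar extends Car {
    ElectricCar(String make) { super(make); }
    // The subclass overrides describe(), so a variable typed as Car
    // can take on a different form at run time.
    @Override
    String describe() { return super.describe() + " (electric)"; }
}

public class CarDemo {
    public static void main(String[] args) {
        Car c = new ElectricCar("Tesla");  // one interface, another form
        System.out.println(c.describe());  // dispatches to ElectricCar.describe()
    }
}
```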



