Oh, I don't consider myself an expert in math, merely half-educated, as I only had basic engineering math and statistics classes. I'm glad your last post is much more reasonable than the one I quoted; seems like you're a reasonable person after all.

Indeed, discrete mathematics could be called a sub-field, but it's also one of the fields of mathematics that influenced computer science the most, especially in the early days of computing theory. It uses many symbols, and cluttering formulas with three-letter notations like AND would make already-big formulas even bigger in the end. Sure, the more "pure" mathematical circles might not use that notation, but within their own subfield they use it exclusively, and would find the word "and" in their formulas as otherworldly as you do ^.

In my opinion, using a single dedicated character that means nothing other than "and" is the correct way to write it, especially for computer science subjects, since discrete mathematics is useful for "computer algorithms, programming languages, cryptography, automated theorem proving, and software development" (from Wikipedia).
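To make the compactness point concrete (my own example, not something from the thread), compare the same propositional formula written with the standard symbols versus with spelled-out connectives:

```latex
% Symbolic notation: one glyph per connective
(p \land q) \lor (\lnot p \land r) \implies (q \lor r)

% The same formula with word connectives (as a comment, since
% LaTeX math mode has no such operators by default):
% (p AND q) OR ((NOT p) AND r) IMPLIES (q OR r)
```

Even in this tiny example the word version is noticeably longer; in a real proof with nested quantifiers and a dozen connectives, the difference adds up fast.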