MarseyWorld/files/routes/notifications.py

import time
from sqlalchemy.sql.expression import not_, and_, or_
from files.classes.mod_logs import ModAction
from files.classes.sub_logs import SubAction
from files.helpers.const import *
from files.helpers.get import *
from files.routes.wrappers import *
from files.__main__ import app
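
# POST /clear: mark every unread comment notification as read and bump the
# post/modlog notification timestamps so their unread indicators reset too.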
@app.post("/clear")
@auth_required
@ratelimit_user()
def clear(v):
notifs = g.db.query(Notification).join(Notification.comment).filter(Notification.read == False, Notification.user_id == v.id).all()
for n in notifs:
n.read = True
g.db.add(n)
v.last_viewed_post_notifs = int(time.time())
v.last_viewed_log_notifs = int(time.time())
g.db.add(v)
return {"message": "Notifications marked as read!"}
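
# GET /unread: return the user's unread comment notifications as JSON,
# marking each one read in the process.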
@app.get("/unread")
@auth_required
@ratelimit_user()
def unread(v):
listing = g.db.query(Notification, Comment).join(Notification.comment).filter(
Notification.read == False,
Notification.user_id == v.id,
Comment.is_banned == False,
Comment.deleted_utc == 0,
).order_by(Notification.created_utc.desc()).all()
for n, c in listing:
n.read = True
g.db.add(n)
return {"data":[x[1].json(g.db) for x in listing]}
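
# GET /notifications/modmail: paginated modmail messages (newest first),
# restricted to admins holding the VIEW_MODMAIL permission.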
@app.get("/notifications/modmail")
@admin_level_required(PERMS['VIEW_MODMAIL'])
def notifications_modmail(v):
try: page = max(int(request.values.get("page", 1)), 1)
except: page = 1
	comments = g.db.query(Comment).filter(Comment.sentto == MODMAIL_ID).order_by(Comment.id.desc()).offset(PAGE_SIZE*(page-1)).limit(PAGE_SIZE+1).all()
next_exists = (len(comments) > PAGE_SIZE)
listing = comments[:PAGE_SIZE]
g.db.commit()
if v.client: return {"data":[x.json(g.db) for x in listing]}
return render_template("notifications.html",
v=v,
notifications=listing,
next_exists=next_exists,
page=page,
standalone=True,
render_replies=True,
)
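
# GET /notifications/messages: the user's DM threads ordered by most recent
# activity, clearing their unread message notifications as a side effect.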
@app.get("/notifications/messages")
@auth_required
def notifications_messages(v):
try: page = max(int(request.values.get("page", 1)), 1)
except: page = 1
	# All of these queries are horrible. For whoever comes here after me,
	# PLEASE just turn DMs into their own table and get them out of
	# Notifications & Comments. It's worth it. Save yourself.
message_threads = g.db.query(Comment).filter(
Comment.sentto != None,
or_(Comment.author_id == v.id, Comment.sentto == v.id),
Comment.parent_submission == None,
Comment.level == 1,
)
if not v.shadowbanned and v.admin_level < PERMS['NOTIFICATIONS_FROM_SHADOWBANNED_USERS']:
message_threads = message_threads.join(Comment.author) \
.filter(User.shadowbanned == None)
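	# Pick each thread's newest message (distinct on top_comment_id, newest first)
	# so the threads themselves can be ordered by most recent activity below.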
thread_order = g.db.query(Comment.top_comment_id, Comment.created_utc) \
.distinct(Comment.top_comment_id) \
.filter(
Comment.sentto != None,
or_(Comment.author_id == v.id, Comment.sentto == v.id),
).order_by(
Comment.top_comment_id.desc(),
Comment.created_utc.desc()
).subquery()
message_threads = message_threads.join(thread_order,
thread_order.c.top_comment_id == Comment.top_comment_id)
message_threads = message_threads.order_by(thread_order.c.created_utc.desc()) \
.offset(PAGE_SIZE*(page-1)).limit(PAGE_SIZE+1).all()
# Clear notifications (used for unread indicator only) for all user messages.
notifs_unread_row = g.db.query(Notification.comment_id).join(Comment).filter(
Notification.user_id == v.id,
Notification.read == False,
or_(Comment.author_id == v.id, Comment.sentto == v.id),
).all()
notifs_unread = [n.comment_id for n in notifs_unread_row]
g.db.query(Notification).filter(
Notification.user_id == v.id,
Notification.comment_id.in_(notifs_unread),
).update({Notification.read: True})
g.db.commit()
	next_exists = (len(message_threads) > PAGE_SIZE)
	listing = message_threads[:PAGE_SIZE]
	list_to_preserve_unread_attribute = []
comments_unread = g.db.query(Comment).filter(Comment.id.in_(notifs_unread))
for c in comments_unread:
c.unread = True
		list_to_preserve_unread_attribute.append(c)
if v.client: return {"data":[x.json(g.db) for x in listing]}
return render_template("notifications.html",
v=v,
notifications=listing,
next_exists=next_exists,
page=page,
standalone=True,
render_replies=True,
)
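
# GET /notifications/posts: recent posts from followed users and subs,
# flagged unread if created since post notifications were last viewed.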
@app.get("/notifications/posts")
@auth_required
def notifications_posts(v):
try: page = max(int(request.values.get("page", 1)), 1)
except: page = 1
listing = [x[0] for x in g.db.query(Submission.id).filter(
or_(
Submission.author_id.in_(v.followed_users),
Submission.sub.in_(v.followed_subs)
),
Submission.deleted_utc == 0,
Submission.is_banned == False,
Submission.private == False,
Submission.notify == True,
Submission.author_id != v.id,
Submission.ghost == False,
Submission.author_id.notin_(v.userblocks)
).order_by(Submission.created_utc.desc()).offset(PAGE_SIZE * (page - 1)).limit(PAGE_SIZE+1).all()]
	next_exists = (len(listing) > PAGE_SIZE)
	listing = listing[:PAGE_SIZE]
listing = get_posts(listing, v=v, eager=True)
for p in listing:
p.unread = p.created_utc > v.last_viewed_post_notifs
v.last_viewed_post_notifs = int(time.time())
g.db.add(v)
if v.client: return {"data":[x.json(g.db) for x in listing]}
return render_template("notifications.html",
v=v,
notifications=listing,
next_exists=next_exists,
page=page,
standalone=True,
render_replies=True,
)
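
# GET /notifications/modactions: moderation-log notifications; admins with the
# right permission see site ModActions, sub moderators see their subs' SubActions.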
@app.get("/notifications/modactions")
@auth_required
def notifications_modactions(v):
try: page = max(int(request.values.get("page", 1)), 1)
except: page = 1
if v.admin_level >= PERMS['NOTIFICATIONS_MODERATOR_ACTIONS']:
cls = ModAction
elif v.moderated_subs:
cls = SubAction
else:
abort(403)
listing = g.db.query(cls).filter(cls.user_id != v.id)
if cls == SubAction:
listing = listing.filter(cls.sub.in_(v.moderated_subs))
listing = listing.order_by(cls.id.desc()).offset(PAGE_SIZE*(page-1)).limit(PAGE_SIZE+1).all()
next_exists = len(listing) > PAGE_SIZE
listing = listing[:PAGE_SIZE]
for ma in listing:
ma.unread = ma.created_utc > v.last_viewed_log_notifs
v.last_viewed_log_notifs = int(time.time())
g.db.add(v)
return render_template("notifications.html",
v=v,
notifications=listing,
next_exists=next_exists,
page=page,
standalone=True,
render_replies=True,
)
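
# GET /notifications/reddit: offsite reddit mentions posted by the AUTOJANNY
# account, visible only to users allowed to view them.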
@app.get("/notifications/reddit")
@auth_required
def notifications_reddit(v):
try: page = max(int(request.values.get("page", 1)), 1)
except: page = 1
if not v.can_view_offsitementions: abort(403)
notifications = g.db.query(Notification, Comment).join(Notification.comment).filter(
Notification.user_id == v.id,
Comment.body_html.like('%<p>New site mention%<a href="https://old.reddit.com/r/%'),
Comment.parent_submission == None,
Comment.author_id == AUTOJANNY_ID
).order_by(Notification.created_utc.desc()).offset(25 * (page - 1)).limit(101).all()
listing = []
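	# Mark unread mentions read while flagging them unread for this render;
	# stop once an already-read mention shows up beyond the first 25.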
for index, x in enumerate(notifications[:100]):
n, c = x
if n.read and index > 24: break
elif not n.read:
n.read = True
c.unread = True
g.db.add(n)
if n.created_utc > 1620391248: c.notif_utc = n.created_utc
listing.append(c)
next_exists = (len(notifications) > len(listing))
g.db.commit()
if v.client: return {"data":[x.json(g.db) for x in listing]}
return render_template("notifications.html",
v=v,
notifications=listing,
next_exists=next_exists,
page=page,
standalone=True,
render_replies=True,
)
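
# GET /notifications: the main feed of comment notifications, excluding the
# offsite reddit mentions handled by /notifications/reddit above.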
@app.get("/notifications")
@auth_required
def notifications(v):
try: page = max(int(request.values.get("page", 1)), 1)
except: page = 1
comments = g.db.query(Comment, Notification).join(Notification.comment).join(Comment.author).filter(
Notification.user_id == v.id,
Comment.is_banned == False,
Comment.deleted_utc == 0,
Comment.body_html.notlike('%<p>New site mention%<a href="https://old.reddit.com/r/%'),
or_(Comment.sentto == None, Comment.sentto == MODMAIL_ID),
not_(and_(Comment.sentto != None, Comment.sentto == MODMAIL_ID, User.is_muted)),
)
if v.admin_level < PERMS['USER_SHADOWBAN']:
comments = comments.filter(User.shadowbanned == None)
comments = comments.order_by(Notification.created_utc.desc())
comments = comments.offset(PAGE_SIZE * (page - 1)).limit(PAGE_SIZE+1).all()
next_exists = (len(comments) > PAGE_SIZE)
comments = comments[:PAGE_SIZE]
cids = [x[0].id for x in comments]
comms = get_comments(cids, v=v)
listing = []
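	# Mark unread notifications read, then climb each comment's parent chain (up
	# to 50 levels) while the parent is the viewer's own comment or another
	# notified comment, attaching reply lists so it renders with thread context;
	# DMs (no parent_submission) get their whole thread from the root instead.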
for c, n in comments:
if n.created_utc > 1620391248: c.notif_utc = n.created_utc
if not n.read:
n.read = True
c.unread = True
g.db.add(n)
if c.parent_submission:
if c.replies2 == None:
c.replies2 = g.db.query(Comment).filter_by(parent_comment_id=c.id).filter(or_(Comment.author_id == v.id, Comment.id.in_(cids))).order_by(Comment.id.desc()).all()
for x in c.replies2:
if x.replies2 == None: x.replies2 = []
count = 0
while count < 50 and c.parent_comment and (c.parent_comment.author_id == v.id or c.parent_comment.id in cids):
count += 1
c = c.parent_comment
if c.replies2 == None:
c.replies2 = g.db.query(Comment).filter_by(parent_comment_id=c.id).filter(or_(Comment.author_id == v.id, Comment.id.in_(cids))).order_by(Comment.id.desc()).all()
for x in c.replies2:
if x.replies2 == None:
x.replies2 = g.db.query(Comment).filter_by(parent_comment_id=x.id).filter(or_(Comment.author_id == v.id, Comment.id.in_(cids))).order_by(Comment.id.desc()).all()
else:
while c.parent_comment:
c = c.parent_comment
c.replies2 = g.db.query(Comment).filter_by(parent_comment_id=c.id).order_by(Comment.id).all()
if c not in listing: listing.append(c)
g.db.commit()
if v.client: return {"data":[x.json(g.db) for x in listing]}
return render_template("notifications.html",
v=v,
notifications=listing,
next_exists=next_exists,
page=page,
standalone=True,
render_replies=True,
)